Mar 07 21:12:29.133435 master-0 systemd[1]: Starting Kubernetes Kubelet...
Mar 07 21:12:29.930063 master-0 kubenswrapper[4172]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 07 21:12:29.930063 master-0 kubenswrapper[4172]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 07 21:12:29.930063 master-0 kubenswrapper[4172]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 07 21:12:29.930063 master-0 kubenswrapper[4172]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 07 21:12:29.930063 master-0 kubenswrapper[4172]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 07 21:12:29.930063 master-0 kubenswrapper[4172]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
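Each deprecation warning above points at the same remedy: move the flag into the KubeletConfiguration file named by --config. As a minimal sketch (the helper name `deprecated_flags` is hypothetical, not part of any kubelet tooling), the flags needing migration can be pulled out of a log excerpt like this one:

```python
import re

def deprecated_flags(log_text: str) -> list[str]:
    """Return the unique deprecated kubelet flags named in the log, in first-seen order."""
    seen: dict[str, None] = {}
    for name in re.findall(r"Flag (--[\w-]+) has been deprecated", log_text):
        seen.setdefault(name)
    return list(seen)

sample = (
    "kubenswrapper[4172]: Flag --system-reserved has been deprecated, "
    "This parameter should be set via the config file specified by the "
    "Kubelet's --config flag."
)
print(deprecated_flags(sample))  # ['--system-reserved']
```

Running it over the full excerpt would list all six flags warned about above.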
Mar 07 21:12:29.931943 master-0 kubenswrapper[4172]: I0307 21:12:29.931143 4172 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934064 4172 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934113 4172 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934119 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934126 4172 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934132 4172 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934137 4172 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934143 4172 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934149 4172 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934156 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934163 4172 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934168 4172 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:12:29.934140 master-0 kubenswrapper[4172]: W0307 21:12:29.934174 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934179 4172 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934184 4172 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934200 4172 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934205 4172 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934212 4172 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934217 4172 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934221 4172 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934226 4172 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934230 4172 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934235 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934239 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934244 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934249 4172 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934253 4172 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934258 4172 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934262 4172 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934267 4172 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934273 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:12:29.934806 master-0 kubenswrapper[4172]: W0307 21:12:29.934278 4172 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934282 4172 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934286 4172 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934291 4172 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934298 4172 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934304 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934309 4172 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934315 4172 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934319 4172 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934326 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934331 4172 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934336 4172 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934341 4172 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934345 4172 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934350 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934357 4172 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934362 4172 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934366 4172 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934371 4172 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934376 4172 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:12:29.935616 master-0 kubenswrapper[4172]: W0307 21:12:29.934382 4172 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934386 4172 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934391 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934395 4172 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934399 4172 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934404 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934408 4172 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934415 4172 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934421 4172 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934427 4172 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934432 4172 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934437 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934442 4172 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934446 4172 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934450 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934455 4172 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934459 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934464 4172 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934468 4172 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:12:29.936535 master-0 kubenswrapper[4172]: W0307 21:12:29.934473 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: W0307 21:12:29.934477 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: W0307 21:12:29.934482 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935505 4172 flags.go:64] FLAG: --address="0.0.0.0"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935522 4172 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935532 4172 flags.go:64] FLAG: --anonymous-auth="true"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935540 4172 flags.go:64] FLAG: --application-metrics-count-limit="100"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935547 4172 flags.go:64] FLAG: --authentication-token-webhook="false"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935554 4172 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935561 4172 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935568 4172 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935573 4172 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935578 4172 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935584 4172 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935589 4172 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935595 4172 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935600 4172 flags.go:64] FLAG: --cgroup-root=""
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935605 4172 flags.go:64] FLAG: --cgroups-per-qos="true"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935610 4172 flags.go:64] FLAG: --client-ca-file=""
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935615 4172 flags.go:64] FLAG: --cloud-config=""
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935622 4172 flags.go:64] FLAG: --cloud-provider=""
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935628 4172 flags.go:64] FLAG: --cluster-dns="[]"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935635 4172 flags.go:64] FLAG: --cluster-domain=""
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935639 4172 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 07 21:12:29.937516 master-0 kubenswrapper[4172]: I0307 21:12:29.935646 4172 flags.go:64] FLAG: --config-dir=""
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935651 4172 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935658 4172 flags.go:64] FLAG: --container-log-max-files="5"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935665 4172 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935671 4172 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935702 4172 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935708 4172 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935714 4172 flags.go:64] FLAG: --contention-profiling="false"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935719 4172 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935724 4172 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935732 4172 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935737 4172 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935744 4172 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935749 4172 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935754 4172 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935759 4172 flags.go:64] FLAG: --enable-load-reader="false"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935764 4172 flags.go:64] FLAG: --enable-server="true"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935768 4172 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935776 4172 flags.go:64] FLAG: --event-burst="100"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935780 4172 flags.go:64] FLAG: --event-qps="50"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935785 4172 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935790 4172 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935794 4172 flags.go:64] FLAG: --eviction-hard=""
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935800 4172 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 07 21:12:29.938798 master-0 kubenswrapper[4172]: I0307 21:12:29.935805 4172 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935809 4172 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935814 4172 flags.go:64] FLAG: --eviction-soft=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935819 4172 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935824 4172 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935830 4172 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935835 4172 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935840 4172 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935845 4172 flags.go:64] FLAG: --fail-swap-on="true"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935850 4172 flags.go:64] FLAG: --feature-gates=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935856 4172 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935861 4172 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935866 4172 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935871 4172 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935876 4172 flags.go:64] FLAG: --healthz-port="10248"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935881 4172 flags.go:64] FLAG: --help="false"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935887 4172 flags.go:64] FLAG: --hostname-override=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935892 4172 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935898 4172 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935903 4172 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935908 4172 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935913 4172 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935919 4172 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935924 4172 flags.go:64] FLAG: --image-service-endpoint=""
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935929 4172 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 07 21:12:29.940127 master-0 kubenswrapper[4172]: I0307 21:12:29.935934 4172 flags.go:64] FLAG: --kube-api-burst="100"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935939 4172 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935944 4172 flags.go:64] FLAG: --kube-api-qps="50"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935950 4172 flags.go:64] FLAG: --kube-reserved=""
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935954 4172 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935959 4172 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935964 4172 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935969 4172 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935974 4172 flags.go:64] FLAG: --lock-file=""
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935980 4172 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935985 4172 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935990 4172 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.935997 4172 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936002 4172 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936006 4172 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936011 4172 flags.go:64] FLAG: --logging-format="text"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936016 4172 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936021 4172 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936026 4172 flags.go:64] FLAG: --manifest-url=""
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936031 4172 flags.go:64] FLAG: --manifest-url-header=""
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936038 4172 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936043 4172 flags.go:64] FLAG: --max-open-files="1000000"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936049 4172 flags.go:64] FLAG: --max-pods="110"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936054 4172 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936058 4172 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 07 21:12:29.941442 master-0 kubenswrapper[4172]: I0307 21:12:29.936064 4172 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936069 4172 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936074 4172 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936079 4172 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936085 4172 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936096 4172 flags.go:64] FLAG: --node-status-max-images="50"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936101 4172 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936106 4172 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936111 4172 flags.go:64] FLAG: --pod-cidr=""
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936116 4172 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936124 4172 flags.go:64] FLAG: --pod-manifest-path=""
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936129 4172 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936134 4172 flags.go:64] FLAG: --pods-per-core="0"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936139 4172 flags.go:64] FLAG: --port="10250"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936144 4172 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936148 4172 flags.go:64] FLAG: --provider-id=""
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936153 4172 flags.go:64] FLAG: --qos-reserved=""
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936158 4172 flags.go:64] FLAG: --read-only-port="10255"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936163 4172 flags.go:64] FLAG: --register-node="true"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936168 4172 flags.go:64] FLAG: --register-schedulable="true"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936174 4172 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936183 4172 flags.go:64] FLAG: --registry-burst="10"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936188 4172 flags.go:64] FLAG: --registry-qps="5"
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936193 4172 flags.go:64] FLAG: --reserved-cpus=""
Mar 07 21:12:29.943047 master-0 kubenswrapper[4172]: I0307 21:12:29.936198 4172 flags.go:64] FLAG: --reserved-memory=""
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936204 4172 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936209 4172 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936215 4172 flags.go:64] FLAG: --rotate-certificates="false"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936220 4172 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936225 4172 flags.go:64] FLAG: --runonce="false"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936230 4172 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936235 4172 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936241 4172 flags.go:64] FLAG: --seccomp-default="false"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936247 4172 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936252 4172 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936257 4172 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936262 4172 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936276 4172 flags.go:64] FLAG: --storage-driver-password="root"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936281 4172 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936285 4172 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936289 4172 flags.go:64] FLAG: --storage-driver-user="root"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936293 4172 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936297 4172 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936301 4172 flags.go:64] FLAG: --system-cgroups=""
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936305 4172 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936312 4172 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936316 4172 flags.go:64] FLAG: --tls-cert-file=""
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936320 4172 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936326 4172 flags.go:64] FLAG: --tls-min-version=""
Mar 07 21:12:29.944171 master-0 kubenswrapper[4172]: I0307 21:12:29.936329 4172 flags.go:64] FLAG: --tls-private-key-file=""
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936334 4172 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936337 4172 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936341 4172 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936345 4172 flags.go:64] FLAG: --v="2"
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936351 4172 flags.go:64] FLAG: --version="false"
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936357 4172 flags.go:64] FLAG: --vmodule=""
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936362 4172 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: I0307 21:12:29.936366 4172 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936473 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936479 4172 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936483 4172 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936487 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936491 4172 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936496 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936501 4172 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936505 4172 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936509 4172 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936512 4172 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936516 4172 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936528 4172 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936531 4172 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:12:29.945311 master-0 kubenswrapper[4172]: W0307 21:12:29.936535 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936540 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936544 4172 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936548 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936552 4172 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936557 4172 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936563 4172 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936569 4172 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936575 4172 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936580 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936585 4172 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936589 4172 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936593 4172 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936598 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936602 4172 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936607 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936613 4172 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936619 4172 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936624 4172 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:12:29.946489 master-0 kubenswrapper[4172]: W0307 21:12:29.936628 4172 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936632 4172 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936635 4172 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936639 4172 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936643 4172 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936647 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936651 4172 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936654 4172 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936658 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936662 4172 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936665 4172 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936671 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936674 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936695 4172 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936699 4172 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936703 4172 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936708 4172 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936712 4172 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936716 4172 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:12:29.947736 master-0 kubenswrapper[4172]: W0307 21:12:29.936720 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936724 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936727 4172 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936732 4172 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936736 4172 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936739 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936743 4172 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936747 4172 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936750 4172 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936754 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936757 4172 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936761 4172 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936764 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936768 4172 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936771 4172 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936774 4172 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936778 4172 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936783 4172 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936786 4172 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936790 4172 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:12:29.948602 master-0 kubenswrapper[4172]: W0307 21:12:29.936794 4172 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:12:29.950044 master-0 kubenswrapper[4172]: I0307 21:12:29.936807 4172 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 21:12:29.950733 master-0 kubenswrapper[4172]: I0307 21:12:29.950623 4172 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 07 21:12:29.950798 master-0 kubenswrapper[4172]: I0307 21:12:29.950742 4172 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951655 4172 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951748 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951761 4172 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951772 4172 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951800 4172 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951816 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951829 4172 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951840 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951849 4172 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951860 4172 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951870 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951880 4172 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951895 4172 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951909 4172 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951922 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951932 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951957 4172 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951967 4172 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:12:29.952905 master-0 kubenswrapper[4172]: W0307 21:12:29.951978 4172 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.951989 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952001 4172 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952011 4172 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952023 4172 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952035 4172 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952047 4172 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952058 4172 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952069 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952080 4172 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952100 4172 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952114 4172 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952129 4172 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952142 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952153 4172 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952164 4172 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952174 4172 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952185 4172 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952199 4172 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952209 4172 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:12:29.954484 master-0 kubenswrapper[4172]: W0307 21:12:29.952221 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952231 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952250 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952260 4172 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952270 4172 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952283 4172 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952295 4172 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952309 4172 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952319 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952330 4172 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952341 4172 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952352 4172 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952362 4172 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952372 4172 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952383 4172 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952401 4172 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952412 4172 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952422 4172 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952432 4172 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:12:29.955950 master-0 kubenswrapper[4172]: W0307 21:12:29.952443 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952452 4172 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952462 4172 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952471 4172 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952485 4172 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952496 4172 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952506 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952515 4172 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952533 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952543 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952552 4172 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952562 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952572 4172 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952581 4172 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: W0307 21:12:29.952591 4172 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:12:29.957543 master-0 kubenswrapper[4172]: I0307 21:12:29.952612 4172 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953800 4172 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953828 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953839 4172 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953850 4172 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953858 4172 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953867 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953875 4172 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953883 4172 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953895 4172 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953908 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953918 4172 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953927 4172 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953935 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953944 4172 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953952 4172 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953960 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953969 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953977 4172 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953985 4172 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:12:29.958445 master-0 kubenswrapper[4172]: W0307 21:12:29.953993 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954001 4172 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954009 4172 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954017 4172 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954025 4172 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954033 4172 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954041 4172 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954051 4172 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954062 4172 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954100 4172 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954109 4172 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954117 4172 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954126 4172 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954134 4172 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954143 4172 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954150 4172 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954161 4172 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954170 4172 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954178 4172 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:12:29.959533 master-0 kubenswrapper[4172]: W0307 21:12:29.954186 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954194 4172 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954201 4172 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954209 4172 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954217 4172 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954225 4172 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954232 4172 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954240 4172 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954247 4172 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954256 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954263 4172 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954271 4172 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954279 4172 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954286 4172 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954297 4172 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954305 4172 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954314 4172 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954323 4172 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954331 4172 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954339 4172 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:12:29.960845 master-0 kubenswrapper[4172]: W0307 21:12:29.954348 4172 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954355 4172 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954363 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954371 4172 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954378 4172 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954386 4172 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954396 4172 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954406 4172 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954415 4172 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954425 4172 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954434 4172 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954442 4172 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954450 4172 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: W0307 21:12:29.954457 4172 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: I0307 21:12:29.954470 4172 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 21:12:29.962123 master-0 kubenswrapper[4172]: I0307 21:12:29.955714 4172 server.go:940] "Client rotation is on, will bootstrap in background" Mar 07 21:12:29.963024 master-0 kubenswrapper[4172]: I0307 21:12:29.960113 4172 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 07 21:12:29.963024 master-0 kubenswrapper[4172]: I0307 21:12:29.961641 4172 server.go:997] "Starting client certificate rotation" Mar 07 21:12:29.963024 master-0 kubenswrapper[4172]: I0307 21:12:29.961662 4172 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 07 21:12:29.963024 master-0 kubenswrapper[4172]: I0307 21:12:29.961902 4172 
certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 21:12:29.989730 master-0 kubenswrapper[4172]: I0307 21:12:29.989587 4172 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 21:12:29.992852 master-0 kubenswrapper[4172]: I0307 21:12:29.992756 4172 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 21:12:29.997710 master-0 kubenswrapper[4172]: E0307 21:12:29.997596 4172 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:30.021058 master-0 kubenswrapper[4172]: I0307 21:12:30.020928 4172 log.go:25] "Validated CRI v1 runtime API" Mar 07 21:12:30.028441 master-0 kubenswrapper[4172]: I0307 21:12:30.028350 4172 log.go:25] "Validated CRI v1 image API" Mar 07 21:12:30.031177 master-0 kubenswrapper[4172]: I0307 21:12:30.031128 4172 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 07 21:12:30.037391 master-0 kubenswrapper[4172]: I0307 21:12:30.037330 4172 fs.go:135] Filesystem UUIDs: map[424f727a-1c86-4a89-859c-7d0acaca7766:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4] Mar 07 21:12:30.037466 master-0 kubenswrapper[4172]: I0307 21:12:30.037378 4172 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs 
blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0}] Mar 07 21:12:30.079927 master-0 kubenswrapper[4172]: I0307 21:12:30.078950 4172 manager.go:217] Machine: {Timestamp:2026-03-07 21:12:30.075463031 +0000 UTC m=+0.747880998 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514145280 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:fd388b0a7ee840b7a9a8619058f28513 SystemUUID:fd388b0a-7ee8-40b7-a9a8-619058f28513 BootID:1e0d9bad-17ce-4467-8d98-7b297ec5d412 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257070592 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257074688 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:1450} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:1450} {Name:eth1 MacAddress:fa:16:3e:6e:e9:7d Speed:-1 Mtu:1450} {Name:eth2 MacAddress:fa:16:3e:38:b1:02 Speed:-1 Mtu:1450} {Name:ovs-system MacAddress:e6:01:79:57:35:2d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514145280 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] 
Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 07 21:12:30.079927 master-0 kubenswrapper[4172]: I0307 21:12:30.079830 4172 manager_no_libpfm.go:29] cAdvisor is build 
without cgo and/or libpfm support. Perf event counters are not available. Mar 07 21:12:30.080254 master-0 kubenswrapper[4172]: I0307 21:12:30.080035 4172 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 07 21:12:30.080616 master-0 kubenswrapper[4172]: I0307 21:12:30.080573 4172 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 07 21:12:30.081046 master-0 kubenswrapper[4172]: I0307 21:12:30.080975 4172 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 07 21:12:30.081400 master-0 kubenswrapper[4172]: I0307 21:12:30.081038 4172 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"ima
gefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 07 21:12:30.081472 master-0 kubenswrapper[4172]: I0307 21:12:30.081423 4172 topology_manager.go:138] "Creating topology manager with none policy" Mar 07 21:12:30.081472 master-0 kubenswrapper[4172]: I0307 21:12:30.081445 4172 container_manager_linux.go:303] "Creating device plugin manager" Mar 07 21:12:30.081472 master-0 kubenswrapper[4172]: I0307 21:12:30.081462 4172 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 21:12:30.081586 master-0 kubenswrapper[4172]: I0307 21:12:30.081507 4172 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 21:12:30.082404 master-0 kubenswrapper[4172]: I0307 21:12:30.082361 4172 state_mem.go:36] "Initialized new in-memory state store" Mar 07 21:12:30.082555 master-0 kubenswrapper[4172]: I0307 21:12:30.082522 4172 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 07 21:12:30.086518 master-0 kubenswrapper[4172]: I0307 21:12:30.086462 4172 kubelet.go:418] "Attempting to sync node with API server" Mar 07 21:12:30.086604 master-0 kubenswrapper[4172]: I0307 21:12:30.086522 4172 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 07 21:12:30.086717 master-0 kubenswrapper[4172]: I0307 21:12:30.086658 4172 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 07 21:12:30.086717 master-0 kubenswrapper[4172]: I0307 21:12:30.086714 4172 
kubelet.go:324] "Adding apiserver pod source" Mar 07 21:12:30.086815 master-0 kubenswrapper[4172]: I0307 21:12:30.086746 4172 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 07 21:12:30.094103 master-0 kubenswrapper[4172]: I0307 21:12:30.094013 4172 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 07 21:12:30.096020 master-0 kubenswrapper[4172]: W0307 21:12:30.095932 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:30.096101 master-0 kubenswrapper[4172]: E0307 21:12:30.096039 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:30.096101 master-0 kubenswrapper[4172]: W0307 21:12:30.095945 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:30.096173 master-0 kubenswrapper[4172]: E0307 21:12:30.096103 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 
21:12:30.096329 master-0 kubenswrapper[4172]: I0307 21:12:30.096286 4172 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 07 21:12:30.096872 master-0 kubenswrapper[4172]: I0307 21:12:30.096817 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 07 21:12:30.096935 master-0 kubenswrapper[4172]: I0307 21:12:30.096879 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 07 21:12:30.096935 master-0 kubenswrapper[4172]: I0307 21:12:30.096897 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 07 21:12:30.096935 master-0 kubenswrapper[4172]: I0307 21:12:30.096912 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 07 21:12:30.096935 master-0 kubenswrapper[4172]: I0307 21:12:30.096926 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 07 21:12:30.096935 master-0 kubenswrapper[4172]: I0307 21:12:30.096940 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 07 21:12:30.097087 master-0 kubenswrapper[4172]: I0307 21:12:30.096957 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 07 21:12:30.097087 master-0 kubenswrapper[4172]: I0307 21:12:30.096971 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 07 21:12:30.097087 master-0 kubenswrapper[4172]: I0307 21:12:30.096989 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 07 21:12:30.097087 master-0 kubenswrapper[4172]: I0307 21:12:30.097003 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 07 21:12:30.097087 master-0 kubenswrapper[4172]: I0307 21:12:30.097023 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 07 21:12:30.097087 master-0 kubenswrapper[4172]: I0307 21:12:30.097048 4172 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/local-volume" Mar 07 21:12:30.098144 master-0 kubenswrapper[4172]: I0307 21:12:30.098104 4172 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 07 21:12:30.099160 master-0 kubenswrapper[4172]: I0307 21:12:30.099116 4172 server.go:1280] "Started kubelet" Mar 07 21:12:30.100646 master-0 kubenswrapper[4172]: I0307 21:12:30.100377 4172 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 07 21:12:30.100777 master-0 kubenswrapper[4172]: I0307 21:12:30.100384 4172 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 07 21:12:30.100777 master-0 kubenswrapper[4172]: I0307 21:12:30.100760 4172 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 07 21:12:30.101479 master-0 systemd[1]: Started Kubernetes Kubelet. Mar 07 21:12:30.101773 master-0 kubenswrapper[4172]: I0307 21:12:30.101726 4172 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 07 21:12:30.102888 master-0 kubenswrapper[4172]: I0307 21:12:30.102731 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:30.105393 master-0 kubenswrapper[4172]: I0307 21:12:30.105345 4172 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 07 21:12:30.105480 master-0 kubenswrapper[4172]: I0307 21:12:30.105410 4172 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 07 21:12:30.106229 master-0 kubenswrapper[4172]: I0307 21:12:30.106197 4172 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 07 21:12:30.106319 master-0 kubenswrapper[4172]: I0307 21:12:30.106305 4172 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 07 21:12:30.106433 master-0 
kubenswrapper[4172]: E0307 21:12:30.106262 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:12:30.106507 master-0 kubenswrapper[4172]: I0307 21:12:30.106480 4172 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 07 21:12:30.107761 master-0 kubenswrapper[4172]: I0307 21:12:30.107727 4172 reconstruct.go:97] "Volume reconstruction finished" Mar 07 21:12:30.107845 master-0 kubenswrapper[4172]: W0307 21:12:30.107663 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:30.107891 master-0 kubenswrapper[4172]: E0307 21:12:30.107836 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:30.107891 master-0 kubenswrapper[4172]: I0307 21:12:30.107763 4172 reconciler.go:26] "Reconciler: start to sync state" Mar 07 21:12:30.108335 master-0 kubenswrapper[4172]: I0307 21:12:30.108299 4172 factory.go:55] Registering systemd factory Mar 07 21:12:30.108399 master-0 kubenswrapper[4172]: I0307 21:12:30.108339 4172 factory.go:221] Registration of the systemd container factory successfully Mar 07 21:12:30.108662 master-0 kubenswrapper[4172]: E0307 21:12:30.108605 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 07 21:12:30.112119 master-0 
kubenswrapper[4172]: E0307 21:12:30.107070 4172 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189aab7b779147fe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.099048446 +0000 UTC m=+0.771466373,LastTimestamp:2026-03-07 21:12:30.099048446 +0000 UTC m=+0.771466373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:30.112365 master-0 kubenswrapper[4172]: I0307 21:12:30.112232 4172 factory.go:153] Registering CRI-O factory Mar 07 21:12:30.112365 master-0 kubenswrapper[4172]: I0307 21:12:30.112279 4172 factory.go:221] Registration of the crio container factory successfully Mar 07 21:12:30.112448 master-0 kubenswrapper[4172]: I0307 21:12:30.112410 4172 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 07 21:12:30.112490 master-0 kubenswrapper[4172]: I0307 21:12:30.112448 4172 factory.go:103] Registering Raw factory Mar 07 21:12:30.112490 master-0 kubenswrapper[4172]: I0307 21:12:30.112472 4172 manager.go:1196] Started watching for new ooms in manager Mar 07 21:12:30.112753 master-0 kubenswrapper[4172]: I0307 21:12:30.112675 4172 server.go:449] "Adding debug handlers to kubelet server" Mar 07 21:12:30.114300 master-0 kubenswrapper[4172]: I0307 21:12:30.114258 4172 manager.go:319] Starting recovery of all containers 
Mar 07 21:12:30.115481 master-0 kubenswrapper[4172]: E0307 21:12:30.115383 4172 kubelet.go:1495] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 07 21:12:30.139217 master-0 kubenswrapper[4172]: I0307 21:12:30.139151 4172 manager.go:324] Recovery completed Mar 07 21:12:30.153738 master-0 kubenswrapper[4172]: I0307 21:12:30.153669 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:30.156231 master-0 kubenswrapper[4172]: I0307 21:12:30.156159 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:30.156319 master-0 kubenswrapper[4172]: I0307 21:12:30.156237 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:30.156319 master-0 kubenswrapper[4172]: I0307 21:12:30.156258 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:30.157405 master-0 kubenswrapper[4172]: I0307 21:12:30.157363 4172 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 07 21:12:30.157405 master-0 kubenswrapper[4172]: I0307 21:12:30.157387 4172 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 07 21:12:30.157516 master-0 kubenswrapper[4172]: I0307 21:12:30.157412 4172 state_mem.go:36] "Initialized new in-memory state store" Mar 07 21:12:30.193904 master-0 kubenswrapper[4172]: I0307 21:12:30.193724 4172 policy_none.go:49] "None policy: Start" Mar 07 21:12:30.195549 master-0 kubenswrapper[4172]: I0307 21:12:30.195480 4172 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 07 21:12:30.195600 master-0 kubenswrapper[4172]: I0307 21:12:30.195576 4172 state_mem.go:35] "Initializing new in-memory state store" Mar 07 21:12:30.206578 master-0 kubenswrapper[4172]: E0307 21:12:30.206517 
4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:12:30.261035 master-0 kubenswrapper[4172]: I0307 21:12:30.260974 4172 manager.go:334] "Starting Device Plugin manager" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.261257 4172 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.261279 4172 server.go:79] "Starting device plugin registration server" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.262427 4172 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.262481 4172 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.262884 4172 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.263044 4172 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.263061 4172 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: E0307 21:12:30.265136 4172 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.276099 4172 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.279099 4172 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6"
Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.279210 4172 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: I0307 21:12:30.279260 4172 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: E0307 21:12:30.279351 4172 kubelet.go:2359] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: W0307 21:12:30.280978 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:30.290374 master-0 kubenswrapper[4172]: E0307 21:12:30.281127 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 07 21:12:30.309908 master-0 kubenswrapper[4172]: E0307 21:12:30.309818 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms"
Mar 07 21:12:30.363041 master-0 kubenswrapper[4172]: I0307 21:12:30.362943 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.365258 master-0 kubenswrapper[4172]: I0307 21:12:30.365189 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.365258 master-0 kubenswrapper[4172]: I0307 21:12:30.365251 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.365468 master-0 kubenswrapper[4172]: I0307 21:12:30.365271 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.365468 master-0 kubenswrapper[4172]: I0307 21:12:30.365324 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 07 21:12:30.366548 master-0 kubenswrapper[4172]: E0307 21:12:30.366488 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 07 21:12:30.379707 master-0 kubenswrapper[4172]: I0307 21:12:30.379616 4172 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 07 21:12:30.379813 master-0 kubenswrapper[4172]: I0307 21:12:30.379752 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.381101 master-0 kubenswrapper[4172]: I0307 21:12:30.381044 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.381101 master-0 kubenswrapper[4172]: I0307 21:12:30.381093 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.381270 master-0 kubenswrapper[4172]: I0307 21:12:30.381111 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.381270 master-0 kubenswrapper[4172]: I0307 21:12:30.381270 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.381903 master-0 kubenswrapper[4172]: I0307 21:12:30.381848 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.381999 master-0 kubenswrapper[4172]: I0307 21:12:30.381935 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.382359 master-0 kubenswrapper[4172]: I0307 21:12:30.382294 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.382359 master-0 kubenswrapper[4172]: I0307 21:12:30.382361 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.382506 master-0 kubenswrapper[4172]: I0307 21:12:30.382381 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.382635 master-0 kubenswrapper[4172]: I0307 21:12:30.382587 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.382903 master-0 kubenswrapper[4172]: I0307 21:12:30.382832 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.383021 master-0 kubenswrapper[4172]: I0307 21:12:30.382913 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.383317 master-0 kubenswrapper[4172]: I0307 21:12:30.383229 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.383317 master-0 kubenswrapper[4172]: I0307 21:12:30.383299 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.383317 master-0 kubenswrapper[4172]: I0307 21:12:30.383318 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.384040 master-0 kubenswrapper[4172]: I0307 21:12:30.383974 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.384040 master-0 kubenswrapper[4172]: I0307 21:12:30.384036 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.384256 master-0 kubenswrapper[4172]: I0307 21:12:30.384062 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.384256 master-0 kubenswrapper[4172]: I0307 21:12:30.384239 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.384609 master-0 kubenswrapper[4172]: I0307 21:12:30.384394 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.384609 master-0 kubenswrapper[4172]: I0307 21:12:30.384417 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.384609 master-0 kubenswrapper[4172]: I0307 21:12:30.384441 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.384609 master-0 kubenswrapper[4172]: I0307 21:12:30.384462 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.384609 master-0 kubenswrapper[4172]: I0307 21:12:30.384514 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.385401 master-0 kubenswrapper[4172]: I0307 21:12:30.385333 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.385401 master-0 kubenswrapper[4172]: I0307 21:12:30.385401 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.385599 master-0 kubenswrapper[4172]: I0307 21:12:30.385425 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.385599 master-0 kubenswrapper[4172]: I0307 21:12:30.385547 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.385599 master-0 kubenswrapper[4172]: I0307 21:12:30.385579 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.385599 master-0 kubenswrapper[4172]: I0307 21:12:30.385582 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.385925 master-0 kubenswrapper[4172]: I0307 21:12:30.385600 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.385925 master-0 kubenswrapper[4172]: I0307 21:12:30.385754 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.385925 master-0 kubenswrapper[4172]: I0307 21:12:30.385805 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.387220 master-0 kubenswrapper[4172]: I0307 21:12:30.387163 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.387325 master-0 kubenswrapper[4172]: I0307 21:12:30.387229 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.387325 master-0 kubenswrapper[4172]: I0307 21:12:30.387172 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.387325 master-0 kubenswrapper[4172]: I0307 21:12:30.387290 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.387325 master-0 kubenswrapper[4172]: I0307 21:12:30.387255 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.387325 master-0 kubenswrapper[4172]: I0307 21:12:30.387319 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.387729 master-0 kubenswrapper[4172]: I0307 21:12:30.387653 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.387814 master-0 kubenswrapper[4172]: I0307 21:12:30.387748 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.388791 master-0 kubenswrapper[4172]: I0307 21:12:30.388745 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.388872 master-0 kubenswrapper[4172]: I0307 21:12:30.388794 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.388872 master-0 kubenswrapper[4172]: I0307 21:12:30.388811 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.410049 master-0 kubenswrapper[4172]: I0307 21:12:30.409966 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.410216 master-0 kubenswrapper[4172]: I0307 21:12:30.410070 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.410216 master-0 kubenswrapper[4172]: I0307 21:12:30.410174 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.410392 master-0 kubenswrapper[4172]: I0307 21:12:30.410276 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.410487 master-0 kubenswrapper[4172]: I0307 21:12:30.410461 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.410708 master-0 kubenswrapper[4172]: I0307 21:12:30.410580 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.410837 master-0 kubenswrapper[4172]: I0307 21:12:30.410742 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.410937 master-0 kubenswrapper[4172]: I0307 21:12:30.410859 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.411034 master-0 kubenswrapper[4172]: I0307 21:12:30.410955 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.411141 master-0 kubenswrapper[4172]: I0307 21:12:30.411054 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.411228 master-0 kubenswrapper[4172]: I0307 21:12:30.411152 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.411324 master-0 kubenswrapper[4172]: I0307 21:12:30.411247 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.411422 master-0 kubenswrapper[4172]: I0307 21:12:30.411340 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.411517 master-0 kubenswrapper[4172]: I0307 21:12:30.411391 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.411620 master-0 kubenswrapper[4172]: I0307 21:12:30.411487 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.411753 master-0 kubenswrapper[4172]: I0307 21:12:30.411582 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.411753 master-0 kubenswrapper[4172]: I0307 21:12:30.411664 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.512627 master-0 kubenswrapper[4172]: I0307 21:12:30.512524 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.512627 master-0 kubenswrapper[4172]: I0307 21:12:30.512592 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.512994 master-0 kubenswrapper[4172]: I0307 21:12:30.512853 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.513108 master-0 kubenswrapper[4172]: I0307 21:12:30.512991 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.513108 master-0 kubenswrapper[4172]: I0307 21:12:30.513021 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.513108 master-0 kubenswrapper[4172]: I0307 21:12:30.512935 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.513365 master-0 kubenswrapper[4172]: I0307 21:12:30.513236 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.513461 master-0 kubenswrapper[4172]: I0307 21:12:30.513369 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.513552 master-0 kubenswrapper[4172]: I0307 21:12:30.513443 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.513552 master-0 kubenswrapper[4172]: I0307 21:12:30.513519 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.513840 master-0 kubenswrapper[4172]: I0307 21:12:30.513554 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.513840 master-0 kubenswrapper[4172]: I0307 21:12:30.513712 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.513840 master-0 kubenswrapper[4172]: I0307 21:12:30.513724 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.513840 master-0 kubenswrapper[4172]: I0307 21:12:30.513815 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.514182 master-0 kubenswrapper[4172]: I0307 21:12:30.513904 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.514182 master-0 kubenswrapper[4172]: I0307 21:12:30.513916 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.514182 master-0 kubenswrapper[4172]: I0307 21:12:30.513976 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.514182 master-0 kubenswrapper[4172]: I0307 21:12:30.514019 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.514182 master-0 kubenswrapper[4172]: I0307 21:12:30.514054 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.514182 master-0 kubenswrapper[4172]: I0307 21:12:30.514126 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514224 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514259 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514338 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514261 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514367 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514447 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514472 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514517 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514560 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514585 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514602 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.514659 master-0 kubenswrapper[4172]: I0307 21:12:30.514670 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.515460 master-0 kubenswrapper[4172]: I0307 21:12:30.514709 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.515460 master-0 kubenswrapper[4172]: I0307 21:12:30.514794 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.567978 master-0 kubenswrapper[4172]: I0307 21:12:30.567730 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.569537 master-0 kubenswrapper[4172]: I0307 21:12:30.569461 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.569705 master-0 kubenswrapper[4172]: I0307 21:12:30.569548 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.569705 master-0 kubenswrapper[4172]: I0307 21:12:30.569602 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.569833 master-0 kubenswrapper[4172]: I0307 21:12:30.569732 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 07 21:12:30.571171 master-0 kubenswrapper[4172]: E0307 21:12:30.571096 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 07 21:12:30.712115 master-0 kubenswrapper[4172]: E0307 21:12:30.711882 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms"
Mar 07 21:12:30.730573 master-0 kubenswrapper[4172]: I0307 21:12:30.730487 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:30.751386 master-0 kubenswrapper[4172]: I0307 21:12:30.751318 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:12:30.767659 master-0 kubenswrapper[4172]: I0307 21:12:30.767557 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:12:30.787851 master-0 kubenswrapper[4172]: I0307 21:12:30.787765 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:12:30.794728 master-0 kubenswrapper[4172]: I0307 21:12:30.794656 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:12:30.971939 master-0 kubenswrapper[4172]: I0307 21:12:30.971646 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:30.973702 master-0 kubenswrapper[4172]: I0307 21:12:30.973623 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:30.973767 master-0 kubenswrapper[4172]: I0307 21:12:30.973740 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:30.973813 master-0 kubenswrapper[4172]: I0307 21:12:30.973768 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:30.973898 master-0 kubenswrapper[4172]: I0307 21:12:30.973860 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 07 21:12:30.975627 master-0 kubenswrapper[4172]: E0307 21:12:30.975553 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0"
Mar 07 21:12:31.104869 master-0 kubenswrapper[4172]: I0307 21:12:31.104734 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:31.237971 master-0 kubenswrapper[4172]: W0307 21:12:31.237740 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:31.237971 master-0 kubenswrapper[4172]: E0307 21:12:31.237882 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 07 21:12:31.394226 master-0 kubenswrapper[4172]: W0307 21:12:31.393649 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582 WatchSource:0}: Error finding container b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582: Status 404 returned error can't find the container with id b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582
Mar 07 21:12:31.398254 master-0 kubenswrapper[4172]: W0307 21:12:31.397961 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf78c05e1499b533b83f091333d61f045.slice/crio-a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff WatchSource:0}: Error finding container a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff: Status 404 returned error can't find the container with id a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff
Mar 07 21:12:31.401838 master-0 kubenswrapper[4172]: I0307 21:12:31.401657 4172 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 07 21:12:31.448608 master-0 kubenswrapper[4172]: W0307 21:12:31.448493 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:31.448608 master-0 kubenswrapper[4172]: E0307 21:12:31.448606 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 07 21:12:31.458605 master-0 kubenswrapper[4172]: W0307 21:12:31.458515 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod354f29997baa583b6238f7de9108ee10.slice/crio-a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8 WatchSource:0}: Error finding container a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8: Status 404 returned error can't find the container with id a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8
Mar 07 21:12:31.486478 master-0 kubenswrapper[4172]: W0307 21:12:31.486398 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9add8df47182fc2eaf8cd78016ebe72.slice/crio-8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61 WatchSource:0}: Error finding container 8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61: Status 404 returned error can't find the container with id 8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61
Mar 07 21:12:31.512400 master-0 kubenswrapper[4172]: W0307
21:12:31.512251 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:31.512576 master-0 kubenswrapper[4172]: E0307 21:12:31.512419 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:31.513136 master-0 kubenswrapper[4172]: E0307 21:12:31.513068 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 07 21:12:31.540479 master-0 kubenswrapper[4172]: W0307 21:12:31.540415 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5f77c8e18b751d90bc0dfe2d4e304050.slice/crio-5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838 WatchSource:0}: Error finding container 5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838: Status 404 returned error can't find the container with id 5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838 Mar 07 21:12:31.767798 master-0 kubenswrapper[4172]: W0307 21:12:31.767564 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:31.767798 master-0 
kubenswrapper[4172]: E0307 21:12:31.767649 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:31.776374 master-0 kubenswrapper[4172]: I0307 21:12:31.776313 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:31.779388 master-0 kubenswrapper[4172]: I0307 21:12:31.779335 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:31.779522 master-0 kubenswrapper[4172]: I0307 21:12:31.779415 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:31.779522 master-0 kubenswrapper[4172]: I0307 21:12:31.779442 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:31.779769 master-0 kubenswrapper[4172]: I0307 21:12:31.779549 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:12:31.780991 master-0 kubenswrapper[4172]: E0307 21:12:31.780898 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 07 21:12:32.059086 master-0 kubenswrapper[4172]: I0307 21:12:32.058948 4172 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 21:12:32.060811 master-0 kubenswrapper[4172]: E0307 21:12:32.060748 4172 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: 
cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:32.104819 master-0 kubenswrapper[4172]: I0307 21:12:32.104747 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:32.286601 master-0 kubenswrapper[4172]: I0307 21:12:32.286495 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff"} Mar 07 21:12:32.289233 master-0 kubenswrapper[4172]: I0307 21:12:32.289205 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582"} Mar 07 21:12:32.291389 master-0 kubenswrapper[4172]: I0307 21:12:32.291361 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838"} Mar 07 21:12:32.293106 master-0 kubenswrapper[4172]: I0307 21:12:32.293077 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61"} Mar 07 21:12:32.294309 master-0 kubenswrapper[4172]: I0307 21:12:32.294281 4172 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8"} Mar 07 21:12:33.037572 master-0 kubenswrapper[4172]: E0307 21:12:33.037391 4172 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/default/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{master-0.189aab7b779147fe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.099048446 +0000 UTC m=+0.771466373,LastTimestamp:2026-03-07 21:12:30.099048446 +0000 UTC m=+0.771466373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:33.104227 master-0 kubenswrapper[4172]: I0307 21:12:33.104124 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:33.115024 master-0 kubenswrapper[4172]: E0307 21:12:33.114953 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 07 21:12:33.382009 master-0 kubenswrapper[4172]: I0307 21:12:33.381948 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller 
attach/detach" Mar 07 21:12:33.383269 master-0 kubenswrapper[4172]: I0307 21:12:33.383228 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:33.383344 master-0 kubenswrapper[4172]: I0307 21:12:33.383285 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:33.383344 master-0 kubenswrapper[4172]: I0307 21:12:33.383306 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:33.383461 master-0 kubenswrapper[4172]: I0307 21:12:33.383422 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:12:33.386337 master-0 kubenswrapper[4172]: E0307 21:12:33.386237 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 07 21:12:33.817640 master-0 kubenswrapper[4172]: W0307 21:12:33.817564 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:33.817640 master-0 kubenswrapper[4172]: E0307 21:12:33.817638 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:34.104382 master-0 kubenswrapper[4172]: I0307 21:12:34.104316 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:34.303647 master-0 kubenswrapper[4172]: I0307 21:12:34.303544 4172 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="f775564de6004b1533b00fbc2fd4348436f4183f4b5381b615f45abdd8af0248" exitCode=0 Mar 07 21:12:34.303647 master-0 kubenswrapper[4172]: I0307 21:12:34.303643 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"f775564de6004b1533b00fbc2fd4348436f4183f4b5381b615f45abdd8af0248"} Mar 07 21:12:34.303976 master-0 kubenswrapper[4172]: I0307 21:12:34.303721 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:34.305422 master-0 kubenswrapper[4172]: I0307 21:12:34.305360 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08"} Mar 07 21:12:34.305648 master-0 kubenswrapper[4172]: I0307 21:12:34.305600 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:34.305648 master-0 kubenswrapper[4172]: I0307 21:12:34.305632 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:34.305648 master-0 kubenswrapper[4172]: I0307 21:12:34.305646 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:34.382379 master-0 kubenswrapper[4172]: W0307 21:12:34.382184 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:34.382379 master-0 kubenswrapper[4172]: E0307 21:12:34.382313 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:34.451715 master-0 kubenswrapper[4172]: W0307 21:12:34.451486 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:34.451715 master-0 kubenswrapper[4172]: E0307 21:12:34.451631 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:34.953964 master-0 kubenswrapper[4172]: W0307 21:12:34.953797 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:34.953964 master-0 kubenswrapper[4172]: E0307 21:12:34.953894 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get 
\"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:35.104798 master-0 kubenswrapper[4172]: I0307 21:12:35.104731 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:35.309206 master-0 kubenswrapper[4172]: I0307 21:12:35.309059 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 07 21:12:35.309934 master-0 kubenswrapper[4172]: I0307 21:12:35.309508 4172 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="21d74a455308a6f7ec6eb8f9a894e0052cb22d993b547b268299c952932d9c42" exitCode=1 Mar 07 21:12:35.309934 master-0 kubenswrapper[4172]: I0307 21:12:35.309561 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"21d74a455308a6f7ec6eb8f9a894e0052cb22d993b547b268299c952932d9c42"} Mar 07 21:12:35.309934 master-0 kubenswrapper[4172]: I0307 21:12:35.309663 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:35.310908 master-0 kubenswrapper[4172]: I0307 21:12:35.310852 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:35.310960 master-0 kubenswrapper[4172]: I0307 21:12:35.310919 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:35.310960 master-0 kubenswrapper[4172]: 
I0307 21:12:35.310933 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:35.311406 master-0 kubenswrapper[4172]: I0307 21:12:35.311374 4172 scope.go:117] "RemoveContainer" containerID="21d74a455308a6f7ec6eb8f9a894e0052cb22d993b547b268299c952932d9c42" Mar 07 21:12:35.312895 master-0 kubenswrapper[4172]: I0307 21:12:35.312868 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe"} Mar 07 21:12:35.312960 master-0 kubenswrapper[4172]: I0307 21:12:35.312936 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:35.313855 master-0 kubenswrapper[4172]: I0307 21:12:35.313813 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:35.313855 master-0 kubenswrapper[4172]: I0307 21:12:35.313842 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:35.313939 master-0 kubenswrapper[4172]: I0307 21:12:35.313860 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:36.104954 master-0 kubenswrapper[4172]: I0307 21:12:36.104892 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:36.316772 master-0 kubenswrapper[4172]: E0307 21:12:36.316653 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": 
dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 07 21:12:36.318443 master-0 kubenswrapper[4172]: I0307 21:12:36.318339 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 07 21:12:36.319258 master-0 kubenswrapper[4172]: I0307 21:12:36.319202 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/0.log" Mar 07 21:12:36.319890 master-0 kubenswrapper[4172]: I0307 21:12:36.319842 4172 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="384c6b9d7a535581fe9d6a422773aba63ca1322f874f1327bdc2693b3e4d84b9" exitCode=1 Mar 07 21:12:36.319977 master-0 kubenswrapper[4172]: I0307 21:12:36.319936 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"384c6b9d7a535581fe9d6a422773aba63ca1322f874f1327bdc2693b3e4d84b9"} Mar 07 21:12:36.320012 master-0 kubenswrapper[4172]: I0307 21:12:36.319996 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:36.320012 master-0 kubenswrapper[4172]: I0307 21:12:36.319999 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:36.320078 master-0 kubenswrapper[4172]: I0307 21:12:36.320027 4172 scope.go:117] "RemoveContainer" containerID="21d74a455308a6f7ec6eb8f9a894e0052cb22d993b547b268299c952932d9c42" Mar 07 21:12:36.321457 master-0 kubenswrapper[4172]: I0307 21:12:36.321410 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:36.321513 master-0 kubenswrapper[4172]: 
I0307 21:12:36.321461 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:36.321513 master-0 kubenswrapper[4172]: I0307 21:12:36.321481 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:36.321573 master-0 kubenswrapper[4172]: I0307 21:12:36.321501 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:36.321573 master-0 kubenswrapper[4172]: I0307 21:12:36.321555 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:36.321573 master-0 kubenswrapper[4172]: I0307 21:12:36.321570 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:36.322067 master-0 kubenswrapper[4172]: I0307 21:12:36.322031 4172 scope.go:117] "RemoveContainer" containerID="384c6b9d7a535581fe9d6a422773aba63ca1322f874f1327bdc2693b3e4d84b9" Mar 07 21:12:36.322385 master-0 kubenswrapper[4172]: E0307 21:12:36.322299 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 07 21:12:36.433932 master-0 kubenswrapper[4172]: I0307 21:12:36.433673 4172 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 21:12:36.435459 master-0 kubenswrapper[4172]: E0307 21:12:36.435403 4172 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control 
plane: cannot create certificate signing request: Post \"https://api-int.sno.openstack.lab:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:36.587144 master-0 kubenswrapper[4172]: I0307 21:12:36.587080 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:36.590537 master-0 kubenswrapper[4172]: I0307 21:12:36.588981 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:36.590537 master-0 kubenswrapper[4172]: I0307 21:12:36.589047 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:36.590537 master-0 kubenswrapper[4172]: I0307 21:12:36.589079 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:36.590537 master-0 kubenswrapper[4172]: I0307 21:12:36.589169 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:12:36.591806 master-0 kubenswrapper[4172]: E0307 21:12:36.591767 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/nodes\": dial tcp 192.168.32.10:6443: connect: connection refused" node="master-0" Mar 07 21:12:37.105016 master-0 kubenswrapper[4172]: I0307 21:12:37.104894 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:37.322253 master-0 kubenswrapper[4172]: I0307 21:12:37.322142 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:37.323103 master-0 kubenswrapper[4172]: I0307 
21:12:37.323047 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:37.323103 master-0 kubenswrapper[4172]: I0307 21:12:37.323093 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:37.323271 master-0 kubenswrapper[4172]: I0307 21:12:37.323110 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:37.323558 master-0 kubenswrapper[4172]: I0307 21:12:37.323511 4172 scope.go:117] "RemoveContainer" containerID="384c6b9d7a535581fe9d6a422773aba63ca1322f874f1327bdc2693b3e4d84b9" Mar 07 21:12:37.326326 master-0 kubenswrapper[4172]: E0307 21:12:37.323756 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 07 21:12:37.668095 master-0 kubenswrapper[4172]: W0307 21:12:37.667953 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused Mar 07 21:12:37.668095 master-0 kubenswrapper[4172]: E0307 21:12:37.668079 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes?fieldSelector=metadata.name%3Dmaster-0&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError" Mar 07 21:12:38.104236 
master-0 kubenswrapper[4172]: I0307 21:12:38.104131 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:38.197183 master-0 kubenswrapper[4172]: W0307 21:12:38.197029 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:38.197183 master-0 kubenswrapper[4172]: E0307 21:12:38.197133 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.sno.openstack.lab:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 07 21:12:38.327657 master-0 kubenswrapper[4172]: I0307 21:12:38.327535 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log"
Mar 07 21:12:38.523883 master-0 kubenswrapper[4172]: W0307 21:12:38.523802 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:38.524088 master-0 kubenswrapper[4172]: E0307 21:12:38.523912 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.sno.openstack.lab:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 192.168.32.10:6443: connect: connection refused" logger="UnhandledError"
Mar 07 21:12:39.105388 master-0 kubenswrapper[4172]: I0307 21:12:39.105005 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.sno.openstack.lab:6443/apis/storage.k8s.io/v1/csinodes/master-0?resourceVersion=0": dial tcp 192.168.32.10:6443: connect: connection refused
Mar 07 21:12:39.334396 master-0 kubenswrapper[4172]: I0307 21:12:39.332867 4172 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71" exitCode=0
Mar 07 21:12:39.334396 master-0 kubenswrapper[4172]: I0307 21:12:39.333089 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71"}
Mar 07 21:12:39.334396 master-0 kubenswrapper[4172]: I0307 21:12:39.333119 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:39.335087 master-0 kubenswrapper[4172]: I0307 21:12:39.334543 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:39.335087 master-0 kubenswrapper[4172]: I0307 21:12:39.334575 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:39.335087 master-0 kubenswrapper[4172]: I0307 21:12:39.334585 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:39.336033 master-0 kubenswrapper[4172]: I0307 21:12:39.335591 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"14697f7165ea16496d207a527c3a0eec6d705bfe290e2065971615387572920a"}
Mar 07 21:12:39.337162 master-0 kubenswrapper[4172]: I0307 21:12:39.337111 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da"}
Mar 07 21:12:39.337258 master-0 kubenswrapper[4172]: I0307 21:12:39.337176 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:39.337934 master-0 kubenswrapper[4172]: I0307 21:12:39.337896 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:39.337934 master-0 kubenswrapper[4172]: I0307 21:12:39.337925 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:39.337934 master-0 kubenswrapper[4172]: I0307 21:12:39.337935 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:39.344591 master-0 kubenswrapper[4172]: I0307 21:12:39.344546 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:39.345172 master-0 kubenswrapper[4172]: I0307 21:12:39.345111 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:39.345172 master-0 kubenswrapper[4172]: I0307 21:12:39.345152 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:39.345172 master-0 kubenswrapper[4172]: I0307 21:12:39.345171 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:40.266285 master-0 kubenswrapper[4172]: E0307 21:12:40.265425 4172 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found"
Mar 07 21:12:40.342133 master-0 kubenswrapper[4172]: I0307 21:12:40.340983 4172 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="14697f7165ea16496d207a527c3a0eec6d705bfe290e2065971615387572920a" exitCode=1
Mar 07 21:12:40.342133 master-0 kubenswrapper[4172]: I0307 21:12:40.341073 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"14697f7165ea16496d207a527c3a0eec6d705bfe290e2065971615387572920a"}
Mar 07 21:12:40.345536 master-0 kubenswrapper[4172]: I0307 21:12:40.345491 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c"}
Mar 07 21:12:40.345631 master-0 kubenswrapper[4172]: I0307 21:12:40.345591 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:40.349733 master-0 kubenswrapper[4172]: I0307 21:12:40.349662 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:40.349808 master-0 kubenswrapper[4172]: I0307 21:12:40.349740 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:40.349808 master-0 kubenswrapper[4172]: I0307 21:12:40.349755 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:41.128135 master-0 kubenswrapper[4172]: I0307 21:12:41.128080 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 21:12:41.128135 master-0 kubenswrapper[4172]: W0307 21:12:41.128094 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope
Mar 07 21:12:41.128135 master-0 kubenswrapper[4172]: E0307 21:12:41.128150 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError"
Mar 07 21:12:42.110140 master-0 kubenswrapper[4172]: I0307 21:12:42.110063 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 21:12:42.351512 master-0 kubenswrapper[4172]: I0307 21:12:42.351442 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"280e10e4ead7199cb4e5eb06d68976c14126e54c3ec3e9d229c33b8faed6eeb7"}
Mar 07 21:12:42.351512 master-0 kubenswrapper[4172]: I0307 21:12:42.351520 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:42.352414 master-0 kubenswrapper[4172]: I0307 21:12:42.352385 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:42.352458 master-0 kubenswrapper[4172]: I0307 21:12:42.352415 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:42.352458 master-0 kubenswrapper[4172]: I0307 21:12:42.352426 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:42.352858 master-0 kubenswrapper[4172]: I0307 21:12:42.352757 4172 scope.go:117] "RemoveContainer" containerID="14697f7165ea16496d207a527c3a0eec6d705bfe290e2065971615387572920a"
Mar 07 21:12:42.354822 master-0 kubenswrapper[4172]: I0307 21:12:42.354695 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea"}
Mar 07 21:12:42.354822 master-0 kubenswrapper[4172]: I0307 21:12:42.354783 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:42.355846 master-0 kubenswrapper[4172]: I0307 21:12:42.355741 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:42.355846 master-0 kubenswrapper[4172]: I0307 21:12:42.355766 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:42.355846 master-0 kubenswrapper[4172]: I0307 21:12:42.355775 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:42.406605 master-0 kubenswrapper[4172]: I0307 21:12:42.406510 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:42.406605 master-0 kubenswrapper[4172]: I0307 21:12:42.406593 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:12:42.747892 master-0 kubenswrapper[4172]: E0307 21:12:42.747791 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 07 21:12:42.992519 master-0 kubenswrapper[4172]: I0307 21:12:42.992408 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:12:42.994035 master-0 kubenswrapper[4172]: I0307 21:12:42.993969 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:12:42.994035 master-0 kubenswrapper[4172]: I0307 21:12:42.994028 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:12:42.994035 master-0 kubenswrapper[4172]: I0307 21:12:42.994046 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:12:42.994374 master-0 kubenswrapper[4172]: I0307 21:12:42.994115 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 07 21:12:43.004159 master-0 kubenswrapper[4172]: E0307 21:12:43.004081 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0"
Mar 07 21:12:43.044425 master-0 kubenswrapper[4172]: E0307 21:12:43.044208 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b779147fe default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.099048446 +0000 UTC m=+0.771466373,LastTimestamp:2026-03-07 21:12:30.099048446 +0000 UTC m=+0.771466373,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.052508 master-0 kubenswrapper[4172]: E0307 21:12:43.052224 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.060491 master-0 kubenswrapper[4172]: E0307 21:12:43.060278 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.068111 master-0 kubenswrapper[4172]: E0307 21:12:43.067968 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa65cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,LastTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.074488 master-0 kubenswrapper[4172]: E0307 21:12:43.074359 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b8165f5f1 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.263981553 +0000 UTC m=+0.936399500,LastTimestamp:2026-03-07 21:12:30.263981553 +0000 UTC m=+0.936399500,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.083075 master-0 kubenswrapper[4172]: E0307 21:12:43.082948 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7af98b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.36522835 +0000 UTC m=+1.037646287,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.089956 master-0 kubenswrapper[4172]: E0307 21:12:43.089823 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa203a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.365263482 +0000 UTC m=+1.037681409,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.097146 master-0 kubenswrapper[4172]: E0307 21:12:43.097014 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa65cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa65cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,LastTimestamp:2026-03-07 21:12:30.365282373 +0000 UTC m=+1.037700310,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.108547 master-0 kubenswrapper[4172]: E0307 21:12:43.108373 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7af98b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.381073448 +0000 UTC m=+1.053491375,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.109325 master-0 kubenswrapper[4172]: I0307 21:12:43.109271 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 07 21:12:43.113840 master-0 kubenswrapper[4172]: E0307 21:12:43.113714 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa203a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.381104529 +0000 UTC m=+1.053522466,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.122463 master-0 kubenswrapper[4172]: E0307 21:12:43.122341 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa65cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa65cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,LastTimestamp:2026-03-07 21:12:30.38112213 +0000 UTC m=+1.053540057,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.126902 master-0 kubenswrapper[4172]: E0307 21:12:43.126779 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7af98b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.382338826 +0000 UTC m=+1.054756753,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.132591 master-0 kubenswrapper[4172]: E0307 21:12:43.132440 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa203a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.382373527 +0000 UTC m=+1.054791464,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.137488 master-0 kubenswrapper[4172]: E0307 21:12:43.137343 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa65cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa65cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,LastTimestamp:2026-03-07 21:12:30.382392367 +0000 UTC m=+1.054810304,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.144536 master-0 kubenswrapper[4172]: E0307 21:12:43.144413 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7af98b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.383272363 +0000 UTC m=+1.055690290,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.149768 master-0 kubenswrapper[4172]: E0307 21:12:43.149590 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa203a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.383311804 +0000 UTC m=+1.055729731,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.156599 master-0 kubenswrapper[4172]: E0307 21:12:43.156435 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa65cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa65cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,LastTimestamp:2026-03-07 21:12:30.383328124 +0000 UTC m=+1.055746061,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.161473 master-0 kubenswrapper[4172]: E0307 21:12:43.161340 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7af98b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.384013555 +0000 UTC m=+1.056431492,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.166102 master-0 kubenswrapper[4172]: E0307 21:12:43.165982 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa203a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.384052126 +0000 UTC m=+1.056470063,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.170910 master-0 kubenswrapper[4172]: E0307 21:12:43.170787 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa65cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa65cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,LastTimestamp:2026-03-07 21:12:30.384075777 +0000 UTC m=+1.056493714,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.177848 master-0 kubenswrapper[4172]: E0307 21:12:43.177721 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7af98b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.384423797 +0000 UTC m=+1.056841734,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.183330 master-0 kubenswrapper[4172]: E0307 21:12:43.183186 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa203a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.384454328 +0000 UTC m=+1.056872265,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.189646 master-0 kubenswrapper[4172]: E0307 21:12:43.189502 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa65cd\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa65cd default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node master-0 status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156269005 +0000 UTC m=+0.828686932,LastTimestamp:2026-03-07 21:12:30.384471678 +0000 UTC m=+1.056889615,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.195931 master-0 kubenswrapper[4172]: E0307 21:12:43.195782 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7af98b3d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7af98b3d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node master-0 status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156213053 +0000 UTC m=+0.828630990,LastTimestamp:2026-03-07 21:12:30.385375535 +0000 UTC m=+1.057793462,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.202072 master-0 kubenswrapper[4172]: E0307 21:12:43.201941 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"master-0.189aab7b7afa203a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{master-0.189aab7b7afa203a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:master-0,UID:master-0,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node master-0 status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:30.156251194 +0000 UTC m=+0.828669121,LastTimestamp:2026-03-07 21:12:30.385416586 +0000 UTC m=+1.057834523,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.209984 master-0 kubenswrapper[4172]: E0307 21:12:43.209831 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189aab7bc5347a0d kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:31.401589261 +0000 UTC m=+2.074007198,LastTimestamp:2026-03-07 21:12:31.401589261 +0000 UTC m=+2.074007198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.215701 master-0 kubenswrapper[4172]: E0307 21:12:43.215544 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7bc536c5af kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:31.401739695 +0000 UTC m=+2.074157622,LastTimestamp:2026-03-07 21:12:31.401739695 +0000 UTC m=+2.074157622,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:12:43.222282 master-0 kubenswrapper[4172]: E0307 21:12:43.222127 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aab7bc923eb08 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:31.467612936 +0000 UTC m=+2.140030843,LastTimestamp:2026-03-07 21:12:31.467612936 +0000 UTC m=+2.140030843,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar
07 21:12:43.232062 master-0 kubenswrapper[4172]: E0307 21:12:43.231914 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7bca7464c1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:31.489664193 +0000 UTC m=+2.162082100,LastTimestamp:2026-03-07 21:12:31.489664193 +0000 UTC m=+2.162082100,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.238002 master-0 kubenswrapper[4172]: E0307 21:12:43.237864 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7bcdb32bdb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulling,Message:Pulling image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:31.544110043 +0000 UTC m=+2.216527950,LastTimestamp:2026-03-07 21:12:31.544110043 +0000 UTC m=+2.216527950,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.251173 master-0 kubenswrapper[4172]: E0307 21:12:43.250887 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c2b48ef32 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" in 1.624s (1.624s including waiting). 
Image size: 465086330 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:33.114206002 +0000 UTC m=+3.786623889,LastTimestamp:2026-03-07 21:12:33.114206002 +0000 UTC m=+3.786623889,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.257728 master-0 kubenswrapper[4172]: E0307 21:12:43.257533 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c3b7131c9 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:33.385279945 +0000 UTC m=+4.057697872,LastTimestamp:2026-03-07 21:12:33.385279945 +0000 UTC m=+4.057697872,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.264176 master-0 kubenswrapper[4172]: E0307 21:12:43.264014 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c3c71cb09 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:33.402096393 +0000 UTC m=+4.074514290,LastTimestamp:2026-03-07 21:12:33.402096393 +0000 UTC m=+4.074514290,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.269863 master-0 kubenswrapper[4172]: E0307 21:12:43.269721 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aab7c581bf710 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" in 2.398s (2.398s including waiting). 
Image size: 529324693 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:33.866233616 +0000 UTC m=+4.538651513,LastTimestamp:2026-03-07 21:12:33.866233616 +0000 UTC m=+4.538651513,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.276544 master-0 kubenswrapper[4172]: E0307 21:12:43.276388 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aab7c65886677 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container: etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.091443831 +0000 UTC m=+4.763861728,LastTimestamp:2026-03-07 21:12:34.091443831 +0000 UTC m=+4.763861728,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.283738 master-0 kubenswrapper[4172]: E0307 21:12:43.283572 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aab7c6707133a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.11652281 +0000 UTC m=+4.788940707,LastTimestamp:2026-03-07 21:12:34.11652281 +0000 UTC m=+4.788940707,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.289794 master-0 kubenswrapper[4172]: E0307 21:12:43.289625 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aab7c673cb8a8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.120038568 +0000 UTC m=+4.792456465,LastTimestamp:2026-03-07 21:12:34.120038568 +0000 UTC m=+4.792456465,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.296445 master-0 kubenswrapper[4172]: E0307 21:12:43.296294 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace 
\"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c7282ae50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.309172816 +0000 UTC m=+4.981590733,LastTimestamp:2026-03-07 21:12:34.309172816 +0000 UTC m=+4.981590733,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.302959 master-0 kubenswrapper[4172]: E0307 21:12:43.302820 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aab7c7401f972 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container: etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.334292338 +0000 UTC m=+5.006710255,LastTimestamp:2026-03-07 21:12:34.334292338 +0000 UTC m=+5.006710255,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.309200 master-0 
kubenswrapper[4172]: E0307 21:12:43.309038 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aab7c75152e77 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.352328311 +0000 UTC m=+5.024746218,LastTimestamp:2026-03-07 21:12:34.352328311 +0000 UTC m=+5.024746218,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.316474 master-0 kubenswrapper[4172]: E0307 21:12:43.316331 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c805a36ed openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.541401837 +0000 UTC m=+5.213819734,LastTimestamp:2026-03-07 21:12:34.541401837 +0000 UTC m=+5.213819734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.321855 master-0 kubenswrapper[4172]: E0307 21:12:43.321659 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c8109c716 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.552907542 +0000 UTC m=+5.225325439,LastTimestamp:2026-03-07 21:12:34.552907542 +0000 UTC m=+5.225325439,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.328942 master-0 kubenswrapper[4172]: E0307 21:12:43.328790 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7c7282ae50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c7282ae50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.309172816 +0000 UTC m=+4.981590733,LastTimestamp:2026-03-07 21:12:35.314707411 +0000 UTC m=+5.987125308,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.334725 master-0 kubenswrapper[4172]: E0307 21:12:43.334533 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7c805a36ed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c805a36ed openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.541401837 +0000 UTC m=+5.213819734,LastTimestamp:2026-03-07 21:12:35.697652016 +0000 UTC m=+6.370069933,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.340549 master-0 kubenswrapper[4172]: E0307 21:12:43.340379 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7c8109c716\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" 
event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c8109c716 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.552907542 +0000 UTC m=+5.225325439,LastTimestamp:2026-03-07 21:12:35.764176132 +0000 UTC m=+6.436594039,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.347168 master-0 kubenswrapper[4172]: E0307 21:12:43.347017 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7cea7fba95 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:36.322245269 +0000 UTC m=+6.994663196,LastTimestamp:2026-03-07 21:12:36.322245269 +0000 UTC m=+6.994663196,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.354390 master-0 kubenswrapper[4172]: E0307 21:12:43.354227 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7cea7fba95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7cea7fba95 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:36.322245269 +0000 UTC m=+6.994663196,LastTimestamp:2026-03-07 21:12:37.323721686 +0000 UTC m=+7.996139603,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.361083 master-0 kubenswrapper[4172]: I0307 21:12:43.361028 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:43.361768 master-0 kubenswrapper[4172]: I0307 21:12:43.361722 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:43.362218 master-0 kubenswrapper[4172]: I0307 21:12:43.362157 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" 
event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"1720f06011ab4886e92b7c5a8e88d7c953f6ae789c60589ab28e6980a7428f51"} Mar 07 21:12:43.362909 master-0 kubenswrapper[4172]: E0307 21:12:43.362534 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189aab7d63468a6b kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 6.946s (6.946s including waiting). Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.348540523 +0000 UTC m=+9.020958420,LastTimestamp:2026-03-07 21:12:38.348540523 +0000 UTC m=+9.020958420,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.363481 master-0 kubenswrapper[4172]: I0307 21:12:43.363435 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:43.363540 master-0 kubenswrapper[4172]: I0307 21:12:43.363482 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:43.363540 master-0 kubenswrapper[4172]: I0307 21:12:43.363502 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:43.364043 master-0 kubenswrapper[4172]: I0307 21:12:43.363984 
4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:43.364104 master-0 kubenswrapper[4172]: I0307 21:12:43.364054 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:43.364104 master-0 kubenswrapper[4172]: I0307 21:12:43.364073 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:43.370007 master-0 kubenswrapper[4172]: E0307 21:12:43.369883 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7d65a88f27 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 6.986s (6.986s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.388518695 +0000 UTC m=+9.060936632,LastTimestamp:2026-03-07 21:12:38.388518695 +0000 UTC m=+9.060936632,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.376970 master-0 kubenswrapper[4172]: E0307 21:12:43.376802 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7d66afd559 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" in 6.861s (6.861s including waiting). 
Image size: 943837171 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.405772633 +0000 UTC m=+9.078190560,LastTimestamp:2026-03-07 21:12:38.405772633 +0000 UTC m=+9.078190560,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.382081 master-0 kubenswrapper[4172]: E0307 21:12:43.381940 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189aab7d722ab933 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container: kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.598375731 +0000 UTC m=+9.270793658,LastTimestamp:2026-03-07 21:12:38.598375731 +0000 UTC m=+9.270793658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.387289 master-0 kubenswrapper[4172]: E0307 21:12:43.387141 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-scheduler-master-0.189aab7d7312bce1 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-scheduler-master-0,UID:a1a56802af72ce1aac6b5077f1695ac0,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.613581025 +0000 UTC m=+9.285998962,LastTimestamp:2026-03-07 21:12:38.613581025 +0000 UTC m=+9.285998962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.394022 master-0 kubenswrapper[4172]: E0307 21:12:43.393869 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7d73bd0534 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.62474066 +0000 UTC m=+9.297158597,LastTimestamp:2026-03-07 21:12:38.62474066 +0000 UTC m=+9.297158597,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.396112 master-0 kubenswrapper[4172]: I0307 21:12:43.396068 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:12:43.400416 master-0 kubenswrapper[4172]: E0307 21:12:43.400267 4172 event.go:359] "Server rejected 
event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7d749801fe kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.639092222 +0000 UTC m=+9.311510149,LastTimestamp:2026-03-07 21:12:38.639092222 +0000 UTC m=+9.311510149,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.407451 master-0 kubenswrapper[4172]: E0307 21:12:43.407291 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7d74b32c6f kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.640872559 +0000 UTC m=+9.313290496,LastTimestamp:2026-03-07 21:12:38.640872559 +0000 UTC 
m=+9.313290496,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.412403 master-0 kubenswrapper[4172]: E0307 21:12:43.412243 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7d74ce927d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container: setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.642668157 +0000 UTC m=+9.315086064,LastTimestamp:2026-03-07 21:12:38.642668157 +0000 UTC m=+9.315086064,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.419136 master-0 kubenswrapper[4172]: E0307 21:12:43.418975 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7d75d6c8f7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container 
setup,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.659983607 +0000 UTC m=+9.332401514,LastTimestamp:2026-03-07 21:12:38.659983607 +0000 UTC m=+9.332401514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.426256 master-0 kubenswrapper[4172]: E0307 21:12:43.426090 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7d9ea30ebc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:39.344459452 +0000 UTC m=+10.016877349,LastTimestamp:2026-03-07 21:12:39.344459452 +0000 UTC m=+10.016877349,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.432332 master-0 kubenswrapper[4172]: E0307 21:12:43.432136 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7dab829646 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container: kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:39.56043527 +0000 UTC m=+10.232853167,LastTimestamp:2026-03-07 21:12:39.56043527 +0000 UTC m=+10.232853167,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.438165 master-0 kubenswrapper[4172]: E0307 21:12:43.437997 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7dac41d08b openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:39.572967563 +0000 UTC m=+10.245385490,LastTimestamp:2026-03-07 21:12:39.572967563 +0000 UTC m=+10.245385490,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.443292 master-0 kubenswrapper[4172]: E0307 21:12:43.443176 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7dac56052e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulling,Message:Pulling image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\",Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:39.574291758 +0000 UTC m=+10.246709685,LastTimestamp:2026-03-07 21:12:39.574291758 +0000 UTC m=+10.246709685,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.448316 master-0 kubenswrapper[4172]: E0307 21:12:43.448154 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7e3c3ace65 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\" in 3.347s (3.347s including waiting). 
Image size: 505242594 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:41.988427365 +0000 UTC m=+12.660845272,LastTimestamp:2026-03-07 21:12:41.988427365 +0000 UTC m=+12.660845272,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.455188 master-0 kubenswrapper[4172]: E0307 21:12:43.455021 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7e3cae2582 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" in 2.421s (2.421s including waiting). 
Image size: 514980169 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:41.995986306 +0000 UTC m=+12.668404213,LastTimestamp:2026-03-07 21:12:41.995986306 +0000 UTC m=+12.668404213,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.462020 master-0 kubenswrapper[4172]: E0307 21:12:43.461836 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7e4daa9277 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container: cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:42.280964727 +0000 UTC m=+12.953382624,LastTimestamp:2026-03-07 21:12:42.280964727 +0000 UTC m=+12.953382624,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.467197 master-0 kubenswrapper[4172]: E0307 21:12:43.467051 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7e4eac063c kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:42.297837116 +0000 UTC m=+12.970255013,LastTimestamp:2026-03-07 21:12:42.297837116 +0000 UTC m=+12.970255013,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.472395 master-0 kubenswrapper[4172]: E0307 21:12:43.472264 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7e4eaebbec openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container: kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:42.2980147 +0000 UTC m=+12.970432637,LastTimestamp:2026-03-07 21:12:42.2980147 +0000 UTC m=+12.970432637,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.479043 master-0 kubenswrapper[4172]: E0307 21:12:43.478906 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{bootstrap-kube-apiserver-master-0.189aab7e4f5dbde5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:bootstrap-kube-apiserver-master-0,UID:5f77c8e18b751d90bc0dfe2d4e304050,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:42.309484005 +0000 UTC m=+12.981901902,LastTimestamp:2026-03-07 21:12:42.309484005 +0000 UTC m=+12.981901902,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.485627 master-0 kubenswrapper[4172]: E0307 21:12:43.485498 4172 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7e52166f30 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:42.355142448 +0000 UTC m=+13.027560345,LastTimestamp:2026-03-07 21:12:42.355142448 +0000 UTC m=+13.027560345,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.492265 master-0 kubenswrapper[4172]: E0307 21:12:43.492140 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189aab7d73bd0534\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7d73bd0534 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container: kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.62474066 +0000 UTC m=+9.297158597,LastTimestamp:2026-03-07 21:12:42.685402682 +0000 UTC m=+13.357820609,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:43.498309 master-0 kubenswrapper[4172]: E0307 21:12:43.498160 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"bootstrap-kube-controller-manager-master-0.189aab7d749801fe\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"kube-system\"" event="&Event{ObjectMeta:{bootstrap-kube-controller-manager-master-0.189aab7d749801fe kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:bootstrap-kube-controller-manager-master-0,UID:f78c05e1499b533b83f091333d61f045,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container 
kube-controller-manager,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:38.639092222 +0000 UTC m=+9.311510149,LastTimestamp:2026-03-07 21:12:42.92245111 +0000 UTC m=+13.594869007,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:44.111149 master-0 kubenswrapper[4172]: I0307 21:12:44.110992 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:44.364342 master-0 kubenswrapper[4172]: I0307 21:12:44.364115 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:44.364342 master-0 kubenswrapper[4172]: I0307 21:12:44.364189 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:44.365587 master-0 kubenswrapper[4172]: I0307 21:12:44.365514 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:44.365587 master-0 kubenswrapper[4172]: I0307 21:12:44.365588 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:44.365734 master-0 kubenswrapper[4172]: I0307 21:12:44.365619 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:44.365900 master-0 kubenswrapper[4172]: I0307 21:12:44.365837 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:44.365960 master-0 kubenswrapper[4172]: I0307 21:12:44.365914 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 
21:12:44.365960 master-0 kubenswrapper[4172]: I0307 21:12:44.365934 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:44.693724 master-0 kubenswrapper[4172]: I0307 21:12:44.693434 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:12:44.702029 master-0 kubenswrapper[4172]: I0307 21:12:44.701969 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:12:45.093142 master-0 kubenswrapper[4172]: I0307 21:12:45.092940 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:45.109480 master-0 kubenswrapper[4172]: I0307 21:12:45.109389 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:45.112537 master-0 kubenswrapper[4172]: I0307 21:12:45.112485 4172 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 07 21:12:45.156945 master-0 kubenswrapper[4172]: I0307 21:12:45.156841 4172 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 07 21:12:45.367432 master-0 kubenswrapper[4172]: I0307 21:12:45.367295 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:45.368529 master-0 kubenswrapper[4172]: I0307 21:12:45.367295 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:45.368943 master-0 kubenswrapper[4172]: I0307 21:12:45.368878 4172 kubelet_node_status.go:724] "Recording event 
message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:45.368943 master-0 kubenswrapper[4172]: I0307 21:12:45.368936 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:45.369165 master-0 kubenswrapper[4172]: I0307 21:12:45.368956 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:45.370250 master-0 kubenswrapper[4172]: I0307 21:12:45.370165 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:45.370355 master-0 kubenswrapper[4172]: I0307 21:12:45.370262 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:45.370355 master-0 kubenswrapper[4172]: I0307 21:12:45.370285 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:45.376605 master-0 kubenswrapper[4172]: I0307 21:12:45.376535 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:12:46.112303 master-0 kubenswrapper[4172]: I0307 21:12:46.112210 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:46.370572 master-0 kubenswrapper[4172]: I0307 21:12:46.370361 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:46.371764 master-0 kubenswrapper[4172]: I0307 21:12:46.371678 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:46.371764 master-0 kubenswrapper[4172]: I0307 21:12:46.371768 4172 
kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:46.371946 master-0 kubenswrapper[4172]: I0307 21:12:46.371787 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:47.111714 master-0 kubenswrapper[4172]: I0307 21:12:47.111598 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:47.373587 master-0 kubenswrapper[4172]: I0307 21:12:47.373380 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:47.375105 master-0 kubenswrapper[4172]: I0307 21:12:47.375056 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:47.375259 master-0 kubenswrapper[4172]: I0307 21:12:47.375115 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:47.375259 master-0 kubenswrapper[4172]: I0307 21:12:47.375135 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:47.683117 master-0 kubenswrapper[4172]: W0307 21:12:47.683042 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:47.683117 master-0 kubenswrapper[4172]: E0307 21:12:47.683118 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource 
\"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 21:12:48.111187 master-0 kubenswrapper[4172]: I0307 21:12:48.111069 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:48.588491 master-0 kubenswrapper[4172]: W0307 21:12:48.588392 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "master-0" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 07 21:12:48.588491 master-0 kubenswrapper[4172]: E0307 21:12:48.588454 4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"master-0\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 07 21:12:49.108004 master-0 kubenswrapper[4172]: I0307 21:12:49.107927 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:49.658242 master-0 kubenswrapper[4172]: I0307 21:12:49.658147 4172 csr.go:261] certificate signing request csr-m627g is approved, waiting to be issued Mar 07 21:12:49.761128 master-0 kubenswrapper[4172]: E0307 21:12:49.761010 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 21:12:50.004572 master-0 kubenswrapper[4172]: I0307 21:12:50.004316 4172 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:50.006903 master-0 kubenswrapper[4172]: I0307 21:12:50.006817 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:50.006903 master-0 kubenswrapper[4172]: I0307 21:12:50.006891 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:50.007118 master-0 kubenswrapper[4172]: I0307 21:12:50.006932 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:50.007118 master-0 kubenswrapper[4172]: I0307 21:12:50.007001 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:12:50.015811 master-0 kubenswrapper[4172]: E0307 21:12:50.015744 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 07 21:12:50.111209 master-0 kubenswrapper[4172]: I0307 21:12:50.111069 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:50.265931 master-0 kubenswrapper[4172]: E0307 21:12:50.265672 4172 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 07 21:12:50.608107 master-0 kubenswrapper[4172]: W0307 21:12:50.608059 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 07 21:12:50.608504 master-0 kubenswrapper[4172]: E0307 21:12:50.608471 
4172 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 07 21:12:51.111553 master-0 kubenswrapper[4172]: I0307 21:12:51.111445 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:51.154642 master-0 kubenswrapper[4172]: I0307 21:12:51.154453 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:51.155077 master-0 kubenswrapper[4172]: I0307 21:12:51.154840 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:51.156949 master-0 kubenswrapper[4172]: I0307 21:12:51.156861 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:51.157034 master-0 kubenswrapper[4172]: I0307 21:12:51.156972 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:51.157034 master-0 kubenswrapper[4172]: I0307 21:12:51.156991 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:51.172410 master-0 kubenswrapper[4172]: W0307 21:12:51.172326 4172 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 07 21:12:51.172657 master-0 kubenswrapper[4172]: E0307 21:12:51.172424 4172 reflector.go:158] "Unhandled 
Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 07 21:12:51.280478 master-0 kubenswrapper[4172]: I0307 21:12:51.280354 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:51.282295 master-0 kubenswrapper[4172]: I0307 21:12:51.282228 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:51.282295 master-0 kubenswrapper[4172]: I0307 21:12:51.282275 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:51.282410 master-0 kubenswrapper[4172]: I0307 21:12:51.282365 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:51.282941 master-0 kubenswrapper[4172]: I0307 21:12:51.282912 4172 scope.go:117] "RemoveContainer" containerID="384c6b9d7a535581fe9d6a422773aba63ca1322f874f1327bdc2693b3e4d84b9" Mar 07 21:12:51.296850 master-0 kubenswrapper[4172]: E0307 21:12:51.296550 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7c7282ae50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c7282ae50 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc8bc0f5aa42fb144ea0956d1f04c5a6cd80501\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.309172816 +0000 UTC m=+4.981590733,LastTimestamp:2026-03-07 21:12:51.286556272 +0000 UTC m=+21.958974209,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:51.539245 master-0 kubenswrapper[4172]: E0307 21:12:51.539025 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7c805a36ed\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c805a36ed openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container: kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.541401837 +0000 UTC m=+5.213819734,LastTimestamp:2026-03-07 21:12:51.52815804 +0000 UTC m=+22.200575977,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:51.547021 master-0 kubenswrapper[4172]: E0307 21:12:51.546820 4172 
event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7c8109c716\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7c8109c716 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:34.552907542 +0000 UTC m=+5.225325439,LastTimestamp:2026-03-07 21:12:51.540916059 +0000 UTC m=+22.213333996,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:51.881066 master-0 kubenswrapper[4172]: I0307 21:12:51.880923 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:51.881398 master-0 kubenswrapper[4172]: I0307 21:12:51.881222 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:51.883134 master-0 kubenswrapper[4172]: I0307 21:12:51.883073 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:51.883134 master-0 kubenswrapper[4172]: I0307 21:12:51.883129 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:51.883514 master-0 kubenswrapper[4172]: I0307 21:12:51.883148 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" 
event="NodeHasSufficientPID" Mar 07 21:12:51.889122 master-0 kubenswrapper[4172]: I0307 21:12:51.889071 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:52.112036 master-0 kubenswrapper[4172]: I0307 21:12:52.111944 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:52.393720 master-0 kubenswrapper[4172]: I0307 21:12:52.393646 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 07 21:12:52.395026 master-0 kubenswrapper[4172]: I0307 21:12:52.394964 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/1.log" Mar 07 21:12:52.395595 master-0 kubenswrapper[4172]: I0307 21:12:52.395565 4172 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932" exitCode=1 Mar 07 21:12:52.395788 master-0 kubenswrapper[4172]: I0307 21:12:52.395633 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932"} Mar 07 21:12:52.395919 master-0 kubenswrapper[4172]: I0307 21:12:52.395902 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:52.396095 master-0 kubenswrapper[4172]: I0307 21:12:52.396049 4172 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 07 21:12:52.396095 master-0 kubenswrapper[4172]: I0307 21:12:52.395897 4172 scope.go:117] "RemoveContainer" containerID="384c6b9d7a535581fe9d6a422773aba63ca1322f874f1327bdc2693b3e4d84b9" Mar 07 21:12:52.397178 master-0 kubenswrapper[4172]: I0307 21:12:52.397128 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:52.397401 master-0 kubenswrapper[4172]: I0307 21:12:52.397198 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:52.397401 master-0 kubenswrapper[4172]: I0307 21:12:52.397224 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:52.397401 master-0 kubenswrapper[4172]: I0307 21:12:52.397307 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:52.397401 master-0 kubenswrapper[4172]: I0307 21:12:52.397351 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:52.397401 master-0 kubenswrapper[4172]: I0307 21:12:52.397372 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:52.402889 master-0 kubenswrapper[4172]: I0307 21:12:52.402084 4172 scope.go:117] "RemoveContainer" containerID="625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932" Mar 07 21:12:52.402889 master-0 kubenswrapper[4172]: E0307 21:12:52.402774 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 07 21:12:52.406981 master-0 kubenswrapper[4172]: I0307 21:12:52.406427 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:52.408246 master-0 kubenswrapper[4172]: I0307 21:12:52.408174 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:52.412167 master-0 kubenswrapper[4172]: E0307 21:12:52.411957 4172 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-rbac-proxy-crio-master-0.189aab7cea7fba95\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-master-0.189aab7cea7fba95 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-master-0,UID:e9add8df47182fc2eaf8cd78016ebe72,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:BackOff,Message:Back-off restarting failed container kube-rbac-proxy-crio in pod kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72),Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:12:36.322245269 +0000 UTC m=+6.994663196,LastTimestamp:2026-03-07 21:12:52.402675484 +0000 UTC m=+23.075093431,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:12:52.414497 master-0 kubenswrapper[4172]: I0307 21:12:52.414415 4172 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:53.111479 master-0 kubenswrapper[4172]: I0307 21:12:53.111368 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:53.403790 master-0 kubenswrapper[4172]: I0307 21:12:53.403478 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 07 21:12:53.404674 master-0 kubenswrapper[4172]: I0307 21:12:53.404351 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:53.405760 master-0 kubenswrapper[4172]: I0307 21:12:53.405667 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:53.405860 master-0 kubenswrapper[4172]: I0307 21:12:53.405763 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:53.405860 master-0 kubenswrapper[4172]: I0307 21:12:53.405783 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:53.408664 master-0 kubenswrapper[4172]: I0307 21:12:53.408613 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:12:54.113483 master-0 kubenswrapper[4172]: I0307 21:12:54.113388 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:54.406833 master-0 kubenswrapper[4172]: I0307 21:12:54.406583 4172 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:54.407990 master-0 kubenswrapper[4172]: I0307 21:12:54.407940 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:54.407990 master-0 kubenswrapper[4172]: I0307 21:12:54.407993 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:54.408137 master-0 kubenswrapper[4172]: I0307 21:12:54.408028 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:55.111500 master-0 kubenswrapper[4172]: I0307 21:12:55.111409 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:55.409460 master-0 kubenswrapper[4172]: I0307 21:12:55.409293 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:55.410756 master-0 kubenswrapper[4172]: I0307 21:12:55.410711 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:55.410841 master-0 kubenswrapper[4172]: I0307 21:12:55.410779 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:55.410841 master-0 kubenswrapper[4172]: I0307 21:12:55.410809 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:56.111492 master-0 kubenswrapper[4172]: I0307 21:12:56.111403 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in 
API group "storage.k8s.io" at the cluster scope Mar 07 21:12:56.770923 master-0 kubenswrapper[4172]: E0307 21:12:56.770796 4172 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"master-0\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 07 21:12:57.017110 master-0 kubenswrapper[4172]: I0307 21:12:57.016958 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:12:57.019590 master-0 kubenswrapper[4172]: I0307 21:12:57.019530 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:12:57.019746 master-0 kubenswrapper[4172]: I0307 21:12:57.019616 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:12:57.019746 master-0 kubenswrapper[4172]: I0307 21:12:57.019636 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:12:57.020318 master-0 kubenswrapper[4172]: I0307 21:12:57.020273 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:12:57.031286 master-0 kubenswrapper[4172]: E0307 21:12:57.031081 4172 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="master-0" Mar 07 21:12:57.111754 master-0 kubenswrapper[4172]: I0307 21:12:57.111651 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:58.112050 master-0 kubenswrapper[4172]: I0307 21:12:58.111948 4172 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:12:59.109101 master-0 kubenswrapper[4172]: I0307 21:12:59.109010 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:13:00.113131 master-0 kubenswrapper[4172]: I0307 21:13:00.113076 4172 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "master-0" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 07 21:13:00.267127 master-0 kubenswrapper[4172]: E0307 21:13:00.266879 4172 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"master-0\" not found" Mar 07 21:13:00.817978 master-0 kubenswrapper[4172]: I0307 21:13:00.817903 4172 csr.go:257] certificate signing request csr-m627g is issued Mar 07 21:13:00.962814 master-0 kubenswrapper[4172]: I0307 21:13:00.962754 4172 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 07 21:13:01.117134 master-0 kubenswrapper[4172]: I0307 21:13:01.117063 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.134484 master-0 kubenswrapper[4172]: I0307 21:13:01.134409 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.191978 master-0 kubenswrapper[4172]: I0307 21:13:01.191910 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.483759 master-0 kubenswrapper[4172]: I0307 21:13:01.483552 4172 
nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.483759 master-0 kubenswrapper[4172]: E0307 21:13:01.483648 4172 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 07 21:13:01.512171 master-0 kubenswrapper[4172]: I0307 21:13:01.512112 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.527374 master-0 kubenswrapper[4172]: I0307 21:13:01.527318 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.585759 master-0 kubenswrapper[4172]: I0307 21:13:01.585659 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.820979 master-0 kubenswrapper[4172]: I0307 21:13:01.820709 4172 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-08 21:04:42 +0000 UTC, rotation deadline is 2026-03-08 15:34:39.180563697 +0000 UTC Mar 07 21:13:01.820979 master-0 kubenswrapper[4172]: I0307 21:13:01.820789 4172 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 18h21m37.359779218s for next certificate rotation Mar 07 21:13:01.861870 master-0 kubenswrapper[4172]: I0307 21:13:01.861764 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.861870 master-0 kubenswrapper[4172]: E0307 21:13:01.861816 4172 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 07 21:13:01.965958 master-0 kubenswrapper[4172]: I0307 21:13:01.965838 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:01.983167 master-0 kubenswrapper[4172]: I0307 21:13:01.983002 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 
21:13:02.046367 master-0 kubenswrapper[4172]: I0307 21:13:02.046292 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:02.325560 master-0 kubenswrapper[4172]: I0307 21:13:02.325421 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:02.325560 master-0 kubenswrapper[4172]: E0307 21:13:02.325468 4172 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 07 21:13:02.873794 master-0 kubenswrapper[4172]: I0307 21:13:02.873713 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:02.889618 master-0 kubenswrapper[4172]: I0307 21:13:02.889489 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:02.947319 master-0 kubenswrapper[4172]: I0307 21:13:02.947235 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:03.210960 master-0 kubenswrapper[4172]: I0307 21:13:03.210729 4172 nodeinfomanager.go:401] Failed to publish CSINode: nodes "master-0" not found Mar 07 21:13:03.210960 master-0 kubenswrapper[4172]: E0307 21:13:03.210791 4172 csi_plugin.go:305] Failed to initialize CSINode: error updating CSINode annotation: timed out waiting for the condition; caused by: nodes "master-0" not found Mar 07 21:13:03.781431 master-0 kubenswrapper[4172]: E0307 21:13:03.781366 4172 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"master-0\" not found" node="master-0" Mar 07 21:13:03.856161 master-0 kubenswrapper[4172]: I0307 21:13:03.856089 4172 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 21:13:04.031619 master-0 kubenswrapper[4172]: I0307 21:13:04.031376 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 
21:13:04.033596 master-0 kubenswrapper[4172]: I0307 21:13:04.033510 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:13:04.033596 master-0 kubenswrapper[4172]: I0307 21:13:04.033584 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:13:04.033596 master-0 kubenswrapper[4172]: I0307 21:13:04.033610 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:13:04.033919 master-0 kubenswrapper[4172]: I0307 21:13:04.033737 4172 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:13:04.046763 master-0 kubenswrapper[4172]: I0307 21:13:04.046677 4172 kubelet_node_status.go:79] "Successfully registered node" node="master-0" Mar 07 21:13:04.046763 master-0 kubenswrapper[4172]: E0307 21:13:04.046761 4172 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": node \"master-0\" not found" Mar 07 21:13:04.062302 master-0 kubenswrapper[4172]: E0307 21:13:04.062217 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.127167 master-0 kubenswrapper[4172]: I0307 21:13:04.127065 4172 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 07 21:13:04.142821 master-0 kubenswrapper[4172]: I0307 21:13:04.142746 4172 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 07 21:13:04.162612 master-0 kubenswrapper[4172]: E0307 21:13:04.162519 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.263801 master-0 kubenswrapper[4172]: E0307 21:13:04.263731 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node 
\"master-0\" not found" Mar 07 21:13:04.364223 master-0 kubenswrapper[4172]: E0307 21:13:04.364144 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.464437 master-0 kubenswrapper[4172]: E0307 21:13:04.464351 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.565008 master-0 kubenswrapper[4172]: E0307 21:13:04.564948 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.665828 master-0 kubenswrapper[4172]: E0307 21:13:04.665639 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.765981 master-0 kubenswrapper[4172]: E0307 21:13:04.765877 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.866873 master-0 kubenswrapper[4172]: E0307 21:13:04.866764 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:04.967218 master-0 kubenswrapper[4172]: E0307 21:13:04.966993 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.067894 master-0 kubenswrapper[4172]: E0307 21:13:05.067813 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.168360 master-0 kubenswrapper[4172]: E0307 21:13:05.168268 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.269305 master-0 kubenswrapper[4172]: E0307 21:13:05.269060 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.370413 master-0 kubenswrapper[4172]: E0307 21:13:05.370289 4172 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.471442 master-0 kubenswrapper[4172]: E0307 21:13:05.471348 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.572548 master-0 kubenswrapper[4172]: E0307 21:13:05.572391 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.673503 master-0 kubenswrapper[4172]: E0307 21:13:05.673391 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.773678 master-0 kubenswrapper[4172]: E0307 21:13:05.773591 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.873807 master-0 kubenswrapper[4172]: E0307 21:13:05.873743 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:05.974650 master-0 kubenswrapper[4172]: E0307 21:13:05.974554 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.075785 master-0 kubenswrapper[4172]: E0307 21:13:06.075654 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.176818 master-0 kubenswrapper[4172]: E0307 21:13:06.176630 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.277784 master-0 kubenswrapper[4172]: E0307 21:13:06.277714 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.280181 master-0 kubenswrapper[4172]: I0307 21:13:06.280126 4172 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:13:06.281528 master-0 
kubenswrapper[4172]: I0307 21:13:06.281496 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:13:06.281624 master-0 kubenswrapper[4172]: I0307 21:13:06.281537 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:13:06.281624 master-0 kubenswrapper[4172]: I0307 21:13:06.281555 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:13:06.282075 master-0 kubenswrapper[4172]: I0307 21:13:06.282047 4172 scope.go:117] "RemoveContainer" containerID="625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932" Mar 07 21:13:06.282259 master-0 kubenswrapper[4172]: E0307 21:13:06.282225 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-rbac-proxy-crio\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-rbac-proxy-crio pod=kube-rbac-proxy-crio-master-0_openshift-machine-config-operator(e9add8df47182fc2eaf8cd78016ebe72)\"" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podUID="e9add8df47182fc2eaf8cd78016ebe72" Mar 07 21:13:06.378951 master-0 kubenswrapper[4172]: E0307 21:13:06.378877 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.479842 master-0 kubenswrapper[4172]: E0307 21:13:06.479636 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.579896 master-0 kubenswrapper[4172]: E0307 21:13:06.579797 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.681083 master-0 kubenswrapper[4172]: E0307 21:13:06.681002 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.781980 
master-0 kubenswrapper[4172]: E0307 21:13:06.781753 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.882794 master-0 kubenswrapper[4172]: E0307 21:13:06.882726 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:06.983604 master-0 kubenswrapper[4172]: E0307 21:13:06.983536 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:07.084249 master-0 kubenswrapper[4172]: E0307 21:13:07.084097 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:07.184564 master-0 kubenswrapper[4172]: E0307 21:13:07.184469 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:07.284960 master-0 kubenswrapper[4172]: E0307 21:13:07.284863 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:07.385933 master-0 kubenswrapper[4172]: E0307 21:13:07.385795 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:07.486957 master-0 kubenswrapper[4172]: E0307 21:13:07.486887 4172 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"master-0\" not found" Mar 07 21:13:07.491423 master-0 kubenswrapper[4172]: I0307 21:13:07.491171 4172 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 07 21:13:08.110043 master-0 kubenswrapper[4172]: I0307 21:13:08.109970 4172 apiserver.go:52] "Watching apiserver" Mar 07 21:13:08.116744 master-0 kubenswrapper[4172]: I0307 21:13:08.116623 4172 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 21:13:08.117085 master-0 kubenswrapper[4172]: I0307 
21:13:08.117021 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["assisted-installer/assisted-installer-controller-mqwls","openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4","openshift-network-operator/network-operator-7c649bf6d4-v4xm9"] Mar 07 21:13:08.117573 master-0 kubenswrapper[4172]: I0307 21:13:08.117515 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.117751 master-0 kubenswrapper[4172]: I0307 21:13:08.117583 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.117751 master-0 kubenswrapper[4172]: I0307 21:13:08.117597 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.120643 master-0 kubenswrapper[4172]: I0307 21:13:08.120591 4172 reflector.go:368] Caches populated for *v1.Secret from object-"assisted-installer"/"assisted-installer-controller-secret" Mar 07 21:13:08.120980 master-0 kubenswrapper[4172]: I0307 21:13:08.120842 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 21:13:08.121554 master-0 kubenswrapper[4172]: I0307 21:13:08.121082 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"kube-root-ca.crt" Mar 07 21:13:08.121554 master-0 kubenswrapper[4172]: I0307 21:13:08.121136 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 21:13:08.121554 master-0 kubenswrapper[4172]: I0307 21:13:08.121423 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"openshift-service-ca.crt" Mar 07 21:13:08.121834 master-0 kubenswrapper[4172]: I0307 21:13:08.121802 4172 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 07 21:13:08.121900 master-0 kubenswrapper[4172]: I0307 21:13:08.121856 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 21:13:08.122579 master-0 kubenswrapper[4172]: I0307 21:13:08.122360 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"assisted-installer"/"assisted-installer-controller-config" Mar 07 21:13:08.122579 master-0 kubenswrapper[4172]: I0307 21:13:08.122455 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 21:13:08.123362 master-0 kubenswrapper[4172]: I0307 21:13:08.123268 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 21:13:08.207577 master-0 kubenswrapper[4172]: I0307 21:13:08.207461 4172 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 07 21:13:08.272414 master-0 kubenswrapper[4172]: I0307 21:13:08.272359 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-resolv-conf\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.272803 master-0 kubenswrapper[4172]: I0307 21:13:08.272756 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.273040 master-0 
kubenswrapper[4172]: I0307 21:13:08.273002 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-sno-bootstrap-files\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.273258 master-0 kubenswrapper[4172]: I0307 21:13:08.273229 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.273500 master-0 kubenswrapper[4172]: I0307 21:13:08.273460 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-ca-bundle\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.273783 master-0 kubenswrapper[4172]: I0307 21:13:08.273750 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-var-run-resolv-conf\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.274013 master-0 kubenswrapper[4172]: I0307 21:13:08.273972 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" 
(UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.274246 master-0 kubenswrapper[4172]: I0307 21:13:08.274214 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.274469 master-0 kubenswrapper[4172]: I0307 21:13:08.274433 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7vlhv\" (UniqueName: \"kubernetes.io/projected/fe626e91-8685-417b-b581-ef2dbd9e0ba9-kube-api-access-7vlhv\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.274734 master-0 kubenswrapper[4172]: I0307 21:13:08.274659 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.274982 master-0 kubenswrapper[4172]: I0307 21:13:08.274944 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: 
\"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.275211 master-0 kubenswrapper[4172]: I0307 21:13:08.275175 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kjhvg\" (UniqueName: \"kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.275432 master-0 kubenswrapper[4172]: I0307 21:13:08.275398 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.376790 master-0 kubenswrapper[4172]: I0307 21:13:08.376564 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.377215 master-0 kubenswrapper[4172]: I0307 21:13:08.376923 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.377215 master-0 kubenswrapper[4172]: I0307 21:13:08.377159 4172 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-sno-bootstrap-files\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.377498 master-0 kubenswrapper[4172]: I0307 21:13:08.377467 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-sno-bootstrap-files\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.377667 master-0 kubenswrapper[4172]: I0307 21:13:08.377623 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.377926 master-0 kubenswrapper[4172]: I0307 21:13:08.377896 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.378109 master-0 kubenswrapper[4172]: I0307 21:13:08.378020 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-ca-bundle\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " 
pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.378882 master-0 kubenswrapper[4172]: I0307 21:13:08.378248 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-ca-bundle\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.379983 master-0 kubenswrapper[4172]: I0307 21:13:08.379920 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-var-run-resolv-conf\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.380114 master-0 kubenswrapper[4172]: I0307 21:13:08.380031 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.380114 master-0 kubenswrapper[4172]: I0307 21:13:08.380099 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.380289 master-0 kubenswrapper[4172]: I0307 21:13:08.380161 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7vlhv\" 
(UniqueName: \"kubernetes.io/projected/fe626e91-8685-417b-b581-ef2dbd9e0ba9-kube-api-access-7vlhv\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.380289 master-0 kubenswrapper[4172]: I0307 21:13:08.380224 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.382225 master-0 kubenswrapper[4172]: I0307 21:13:08.382165 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-var-run-resolv-conf\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.382364 master-0 kubenswrapper[4172]: I0307 21:13:08.382290 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.383555 master-0 kubenswrapper[4172]: I0307 21:13:08.383493 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " 
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.383768 master-0 kubenswrapper[4172]: I0307 21:13:08.380286 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.383885 master-0 kubenswrapper[4172]: I0307 21:13:08.383804 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhvg\" (UniqueName: \"kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.383885 master-0 kubenswrapper[4172]: I0307 21:13:08.383852 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.384051 master-0 kubenswrapper[4172]: I0307 21:13:08.383979 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-resolv-conf\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.384151 master-0 kubenswrapper[4172]: E0307 21:13:08.384067 4172 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret 
"cluster-version-operator-serving-cert" not found Mar 07 21:13:08.384242 master-0 kubenswrapper[4172]: I0307 21:13:08.384189 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-resolv-conf\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.384374 master-0 kubenswrapper[4172]: E0307 21:13:08.384329 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:13:08.88417151 +0000 UTC m=+39.556589437 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:13:08.384612 master-0 kubenswrapper[4172]: I0307 21:13:08.384548 4172 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 07 21:13:08.395998 master-0 kubenswrapper[4172]: I0307 21:13:08.395899 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.413255 master-0 kubenswrapper[4172]: I0307 21:13:08.413143 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhvg\" (UniqueName: \"kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.417291 master-0 kubenswrapper[4172]: I0307 21:13:08.417229 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.420084 master-0 kubenswrapper[4172]: I0307 21:13:08.420002 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7vlhv\" (UniqueName: \"kubernetes.io/projected/fe626e91-8685-417b-b581-ef2dbd9e0ba9-kube-api-access-7vlhv\") pod \"assisted-installer-controller-mqwls\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") " pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.441915 master-0 kubenswrapper[4172]: I0307 21:13:08.441821 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:13:08.474761 master-0 kubenswrapper[4172]: I0307 21:13:08.474662 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:13:08.492915 master-0 kubenswrapper[4172]: W0307 21:13:08.492851 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfe626e91_8685_417b_b581_ef2dbd9e0ba9.slice/crio-cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a WatchSource:0}: Error finding container cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a: Status 404 returned error can't find the container with id cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a Mar 07 21:13:08.700521 master-0 kubenswrapper[4172]: I0307 21:13:08.700329 4172 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 07 21:13:08.888900 master-0 kubenswrapper[4172]: I0307 21:13:08.888763 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:08.889287 master-0 kubenswrapper[4172]: E0307 21:13:08.888999 4172 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 07 21:13:08.889287 master-0 kubenswrapper[4172]: E0307 21:13:08.889117 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. 
No retries permitted until 2026-03-07 21:13:09.889083144 +0000 UTC m=+40.561501081 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:13:09.449477 master-0 kubenswrapper[4172]: I0307 21:13:09.449302 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mqwls" event={"ID":"fe626e91-8685-417b-b581-ef2dbd9e0ba9","Type":"ContainerStarted","Data":"cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a"} Mar 07 21:13:09.450944 master-0 kubenswrapper[4172]: I0307 21:13:09.450588 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" event={"ID":"f8980370-267c-4168-ba97-d780698533ff","Type":"ContainerStarted","Data":"bf99662680409a7aa806c014bae5b66c40427c61c312090f66a2311a2f39a24c"} Mar 07 21:13:09.896923 master-0 kubenswrapper[4172]: I0307 21:13:09.896860 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:09.897187 master-0 kubenswrapper[4172]: E0307 21:13:09.897061 4172 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 07 21:13:09.897187 master-0 kubenswrapper[4172]: E0307 21:13:09.897144 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert 
podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:13:11.897123465 +0000 UTC m=+42.569541372 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:13:10.551787 master-0 kubenswrapper[4172]: I0307 21:13:10.551599 4172 csr.go:261] certificate signing request csr-6mfrp is approved, waiting to be issued Mar 07 21:13:10.559120 master-0 kubenswrapper[4172]: I0307 21:13:10.559047 4172 csr.go:257] certificate signing request csr-6mfrp is issued Mar 07 21:13:11.559282 master-0 kubenswrapper[4172]: I0307 21:13:11.559195 4172 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 21:13:11.560719 master-0 kubenswrapper[4172]: I0307 21:13:11.560590 4172 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-08 21:04:42 +0000 UTC, rotation deadline is 2026-03-08 14:07:55.157898462 +0000 UTC Mar 07 21:13:11.560719 master-0 kubenswrapper[4172]: I0307 21:13:11.560662 4172 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 16h54m43.597243691s for next certificate rotation Mar 07 21:13:11.914204 master-0 kubenswrapper[4172]: I0307 21:13:11.914118 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:13:11.914498 master-0 kubenswrapper[4172]: E0307 21:13:11.914343 4172 secret.go:189] Couldn't get secret 
openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:11.914498 master-0 kubenswrapper[4172]: E0307 21:13:11.914456 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:13:15.914414673 +0000 UTC m=+46.586832610 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:12.561619 master-0 kubenswrapper[4172]: I0307 21:13:12.561509 4172 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-08 21:04:42 +0000 UTC, rotation deadline is 2026-03-08 16:18:09.974497347 +0000 UTC
Mar 07 21:13:12.561619 master-0 kubenswrapper[4172]: I0307 21:13:12.561558 4172 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 19h4m57.412941674s for next certificate rotation
Mar 07 21:13:14.467540 master-0 kubenswrapper[4172]: I0307 21:13:14.467155 4172 generic.go:334] "Generic (PLEG): container finished" podID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerID="5d8a696d04df358a26bc157288f94a3ff4652e100c1ed368a8504d7b4df97ebb" exitCode=0
Mar 07 21:13:14.467540 master-0 kubenswrapper[4172]: I0307 21:13:14.467430 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mqwls" event={"ID":"fe626e91-8685-417b-b581-ef2dbd9e0ba9","Type":"ContainerDied","Data":"5d8a696d04df358a26bc157288f94a3ff4652e100c1ed368a8504d7b4df97ebb"}
Mar 07 21:13:14.470744 master-0 kubenswrapper[4172]: I0307 21:13:14.470654 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" event={"ID":"f8980370-267c-4168-ba97-d780698533ff","Type":"ContainerStarted","Data":"a365b415335d369b3b6313971188bcd1400d9e9f3efd23b32ee5ec456091c9db"}
Mar 07 21:13:15.509612 master-0 kubenswrapper[4172]: I0307 21:13:15.509283 4172 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-mqwls"
Mar 07 21:13:15.530833 master-0 kubenswrapper[4172]: I0307 21:13:15.530725 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" podStartSLOduration=5.969740225 podStartE2EDuration="11.53066883s" podCreationTimestamp="2026-03-07 21:13:04 +0000 UTC" firstStartedPulling="2026-03-07 21:13:08.467548006 +0000 UTC m=+39.139965943" lastFinishedPulling="2026-03-07 21:13:14.028476611 +0000 UTC m=+44.700894548" observedRunningTime="2026-03-07 21:13:14.502087728 +0000 UTC m=+45.174505665" watchObservedRunningTime="2026-03-07 21:13:15.53066883 +0000 UTC m=+46.203086737"
Mar 07 21:13:15.641508 master-0 kubenswrapper[4172]: I0307 21:13:15.641425 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-ca-bundle\") pod \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") "
Mar 07 21:13:15.641920 master-0 kubenswrapper[4172]: I0307 21:13:15.641900 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-resolv-conf\") pod \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") "
Mar 07 21:13:15.642045 master-0 kubenswrapper[4172]: I0307 21:13:15.642019 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-var-run-resolv-conf\") pod \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") "
Mar 07 21:13:15.642162 master-0 kubenswrapper[4172]: I0307 21:13:15.641578 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-ca-bundle" (OuterVolumeSpecName: "host-ca-bundle") pod "fe626e91-8685-417b-b581-ef2dbd9e0ba9" (UID: "fe626e91-8685-417b-b581-ef2dbd9e0ba9"). InnerVolumeSpecName "host-ca-bundle". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:13:15.642231 master-0 kubenswrapper[4172]: I0307 21:13:15.642208 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-var-run-resolv-conf" (OuterVolumeSpecName: "host-var-run-resolv-conf") pod "fe626e91-8685-417b-b581-ef2dbd9e0ba9" (UID: "fe626e91-8685-417b-b581-ef2dbd9e0ba9"). InnerVolumeSpecName "host-var-run-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:13:15.642281 master-0 kubenswrapper[4172]: I0307 21:13:15.641978 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-resolv-conf" (OuterVolumeSpecName: "host-resolv-conf") pod "fe626e91-8685-417b-b581-ef2dbd9e0ba9" (UID: "fe626e91-8685-417b-b581-ef2dbd9e0ba9"). InnerVolumeSpecName "host-resolv-conf". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:13:15.642281 master-0 kubenswrapper[4172]: I0307 21:13:15.642136 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7vlhv\" (UniqueName: \"kubernetes.io/projected/fe626e91-8685-417b-b581-ef2dbd9e0ba9-kube-api-access-7vlhv\") pod \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") "
Mar 07 21:13:15.642415 master-0 kubenswrapper[4172]: I0307 21:13:15.642379 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-sno-bootstrap-files\") pod \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\" (UID: \"fe626e91-8685-417b-b581-ef2dbd9e0ba9\") "
Mar 07 21:13:15.642573 master-0 kubenswrapper[4172]: I0307 21:13:15.642474 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-sno-bootstrap-files" (OuterVolumeSpecName: "sno-bootstrap-files") pod "fe626e91-8685-417b-b581-ef2dbd9e0ba9" (UID: "fe626e91-8685-417b-b581-ef2dbd9e0ba9"). InnerVolumeSpecName "sno-bootstrap-files". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:13:15.642646 master-0 kubenswrapper[4172]: I0307 21:13:15.642542 4172 reconciler_common.go:293] "Volume detached for volume \"host-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-resolv-conf\") on node \"master-0\" DevicePath \"\""
Mar 07 21:13:15.643478 master-0 kubenswrapper[4172]: I0307 21:13:15.643400 4172 reconciler_common.go:293] "Volume detached for volume \"host-var-run-resolv-conf\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-var-run-resolv-conf\") on node \"master-0\" DevicePath \"\""
Mar 07 21:13:15.643478 master-0 kubenswrapper[4172]: I0307 21:13:15.643447 4172 reconciler_common.go:293] "Volume detached for volume \"host-ca-bundle\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-host-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:13:15.649339 master-0 kubenswrapper[4172]: I0307 21:13:15.649236 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe626e91-8685-417b-b581-ef2dbd9e0ba9-kube-api-access-7vlhv" (OuterVolumeSpecName: "kube-api-access-7vlhv") pod "fe626e91-8685-417b-b581-ef2dbd9e0ba9" (UID: "fe626e91-8685-417b-b581-ef2dbd9e0ba9"). InnerVolumeSpecName "kube-api-access-7vlhv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:13:15.744397 master-0 kubenswrapper[4172]: I0307 21:13:15.744243 4172 reconciler_common.go:293] "Volume detached for volume \"sno-bootstrap-files\" (UniqueName: \"kubernetes.io/host-path/fe626e91-8685-417b-b581-ef2dbd9e0ba9-sno-bootstrap-files\") on node \"master-0\" DevicePath \"\""
Mar 07 21:13:15.744397 master-0 kubenswrapper[4172]: I0307 21:13:15.744283 4172 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7vlhv\" (UniqueName: \"kubernetes.io/projected/fe626e91-8685-417b-b581-ef2dbd9e0ba9-kube-api-access-7vlhv\") on node \"master-0\" DevicePath \"\""
Mar 07 21:13:15.946211 master-0 kubenswrapper[4172]: I0307 21:13:15.946134 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:13:15.946554 master-0 kubenswrapper[4172]: E0307 21:13:15.946379 4172 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:15.946554 master-0 kubenswrapper[4172]: E0307 21:13:15.946513 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:13:23.946470941 +0000 UTC m=+54.618888868 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:16.482125 master-0 kubenswrapper[4172]: I0307 21:13:16.482040 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="assisted-installer/assisted-installer-controller-mqwls" event={"ID":"fe626e91-8685-417b-b581-ef2dbd9e0ba9","Type":"ContainerDied","Data":"cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a"}
Mar 07 21:13:16.482125 master-0 kubenswrapper[4172]: I0307 21:13:16.482121 4172 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a"
Mar 07 21:13:16.482125 master-0 kubenswrapper[4172]: I0307 21:13:16.482142 4172 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-mqwls"
Mar 07 21:13:17.039644 master-0 kubenswrapper[4172]: I0307 21:13:17.039568 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/mtu-prober-82nln"]
Mar 07 21:13:17.040278 master-0 kubenswrapper[4172]: E0307 21:13:17.039785 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller"
Mar 07 21:13:17.040278 master-0 kubenswrapper[4172]: I0307 21:13:17.039820 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller"
Mar 07 21:13:17.040278 master-0 kubenswrapper[4172]: I0307 21:13:17.039913 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller"
Mar 07 21:13:17.040278 master-0 kubenswrapper[4172]: I0307 21:13:17.040267 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-82nln"
Mar 07 21:13:17.155296 master-0 kubenswrapper[4172]: I0307 21:13:17.155176 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ldr\" (UniqueName: \"kubernetes.io/projected/2d827a93-49e5-4694-b119-957cfa9bd648-kube-api-access-p6ldr\") pod \"mtu-prober-82nln\" (UID: \"2d827a93-49e5-4694-b119-957cfa9bd648\") " pod="openshift-network-operator/mtu-prober-82nln"
Mar 07 21:13:17.256657 master-0 kubenswrapper[4172]: I0307 21:13:17.256537 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ldr\" (UniqueName: \"kubernetes.io/projected/2d827a93-49e5-4694-b119-957cfa9bd648-kube-api-access-p6ldr\") pod \"mtu-prober-82nln\" (UID: \"2d827a93-49e5-4694-b119-957cfa9bd648\") " pod="openshift-network-operator/mtu-prober-82nln"
Mar 07 21:13:17.289589 master-0 kubenswrapper[4172]: I0307 21:13:17.289488 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ldr\" (UniqueName: \"kubernetes.io/projected/2d827a93-49e5-4694-b119-957cfa9bd648-kube-api-access-p6ldr\") pod \"mtu-prober-82nln\" (UID: \"2d827a93-49e5-4694-b119-957cfa9bd648\") " pod="openshift-network-operator/mtu-prober-82nln"
Mar 07 21:13:17.362231 master-0 kubenswrapper[4172]: I0307 21:13:17.361911 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-82nln"
Mar 07 21:13:17.382174 master-0 kubenswrapper[4172]: W0307 21:13:17.382049 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d827a93_49e5_4694_b119_957cfa9bd648.slice/crio-5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c WatchSource:0}: Error finding container 5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c: Status 404 returned error can't find the container with id 5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c
Mar 07 21:13:17.487340 master-0 kubenswrapper[4172]: I0307 21:13:17.487246 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-82nln" event={"ID":"2d827a93-49e5-4694-b119-957cfa9bd648","Type":"ContainerStarted","Data":"5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c"}
Mar 07 21:13:18.300065 master-0 kubenswrapper[4172]: I0307 21:13:18.299989 4172 scope.go:117] "RemoveContainer" containerID="625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932"
Mar 07 21:13:18.301021 master-0 kubenswrapper[4172]: I0307 21:13:18.300211 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"]
Mar 07 21:13:18.494202 master-0 kubenswrapper[4172]: I0307 21:13:18.494094 4172 generic.go:334] "Generic (PLEG): container finished" podID="2d827a93-49e5-4694-b119-957cfa9bd648" containerID="485cabca7a9edbb9a83d8ef9ee43891f8c296cb8958998f7a4fa97d4fc8e25c3" exitCode=0
Mar 07 21:13:18.494348 master-0 kubenswrapper[4172]: I0307 21:13:18.494205 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-82nln" event={"ID":"2d827a93-49e5-4694-b119-957cfa9bd648","Type":"ContainerDied","Data":"485cabca7a9edbb9a83d8ef9ee43891f8c296cb8958998f7a4fa97d4fc8e25c3"}
Mar 07 21:13:19.499986 master-0 kubenswrapper[4172]: I0307 21:13:19.499917 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 07 21:13:19.502076 master-0 kubenswrapper[4172]: I0307 21:13:19.500591 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a056ccba22060bdb53ac003460ab1c7bac5832040445f86cf7efe33efd5a3ab2"}
Mar 07 21:13:19.520831 master-0 kubenswrapper[4172]: I0307 21:13:19.520768 4172 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-82nln"
Mar 07 21:13:19.522008 master-0 kubenswrapper[4172]: I0307 21:13:19.521897 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" podStartSLOduration=1.521876549 podStartE2EDuration="1.521876549s" podCreationTimestamp="2026-03-07 21:13:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:13:19.521398166 +0000 UTC m=+50.193816143" watchObservedRunningTime="2026-03-07 21:13:19.521876549 +0000 UTC m=+50.194294486"
Mar 07 21:13:19.674762 master-0 kubenswrapper[4172]: I0307 21:13:19.674713 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ldr\" (UniqueName: \"kubernetes.io/projected/2d827a93-49e5-4694-b119-957cfa9bd648-kube-api-access-p6ldr\") pod \"2d827a93-49e5-4694-b119-957cfa9bd648\" (UID: \"2d827a93-49e5-4694-b119-957cfa9bd648\") "
Mar 07 21:13:19.682067 master-0 kubenswrapper[4172]: I0307 21:13:19.681897 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d827a93-49e5-4694-b119-957cfa9bd648-kube-api-access-p6ldr" (OuterVolumeSpecName: "kube-api-access-p6ldr") pod "2d827a93-49e5-4694-b119-957cfa9bd648" (UID: "2d827a93-49e5-4694-b119-957cfa9bd648"). InnerVolumeSpecName "kube-api-access-p6ldr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:13:19.775724 master-0 kubenswrapper[4172]: I0307 21:13:19.775514 4172 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6ldr\" (UniqueName: \"kubernetes.io/projected/2d827a93-49e5-4694-b119-957cfa9bd648-kube-api-access-p6ldr\") on node \"master-0\" DevicePath \"\""
Mar 07 21:13:20.505981 master-0 kubenswrapper[4172]: I0307 21:13:20.505859 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/mtu-prober-82nln" event={"ID":"2d827a93-49e5-4694-b119-957cfa9bd648","Type":"ContainerDied","Data":"5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c"}
Mar 07 21:13:20.505981 master-0 kubenswrapper[4172]: I0307 21:13:20.505951 4172 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c"
Mar 07 21:13:20.505981 master-0 kubenswrapper[4172]: I0307 21:13:20.505897 4172 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/mtu-prober-82nln"
Mar 07 21:13:22.048913 master-0 kubenswrapper[4172]: I0307 21:13:22.048847 4172 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-network-operator/mtu-prober-82nln"]
Mar 07 21:13:22.053259 master-0 kubenswrapper[4172]: I0307 21:13:22.053193 4172 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-network-operator/mtu-prober-82nln"]
Mar 07 21:13:22.286815 master-0 kubenswrapper[4172]: I0307 21:13:22.286750 4172 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" path="/var/lib/kubelet/pods/2d827a93-49e5-4694-b119-957cfa9bd648/volumes"
Mar 07 21:13:23.982725 master-0 kubenswrapper[4172]: I0307 21:13:23.982600 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:13:23.983485 master-0 kubenswrapper[4172]: E0307 21:13:23.983466 4172 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:23.983611 master-0 kubenswrapper[4172]: E0307 21:13:23.983599 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:13:39.983579263 +0000 UTC m=+70.655997150 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:26.932102 master-0 kubenswrapper[4172]: I0307 21:13:26.931839 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-g6nmq"]
Mar 07 21:13:26.933348 master-0 kubenswrapper[4172]: E0307 21:13:26.932163 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober"
Mar 07 21:13:26.933348 master-0 kubenswrapper[4172]: I0307 21:13:26.932182 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober"
Mar 07 21:13:26.933348 master-0 kubenswrapper[4172]: I0307 21:13:26.932214 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober"
Mar 07 21:13:26.933348 master-0 kubenswrapper[4172]: I0307 21:13:26.932521 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:26.936032 master-0 kubenswrapper[4172]: I0307 21:13:26.935972 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 07 21:13:26.936200 master-0 kubenswrapper[4172]: I0307 21:13:26.936131 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 07 21:13:26.936485 master-0 kubenswrapper[4172]: I0307 21:13:26.936453 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 07 21:13:26.938046 master-0 kubenswrapper[4172]: I0307 21:13:26.937977 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt"
Mar 07 21:13:27.009868 master-0 kubenswrapper[4172]: I0307 21:13:27.009766 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.009868 master-0 kubenswrapper[4172]: I0307 21:13:27.009853 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010206 master-0 kubenswrapper[4172]: I0307 21:13:27.009926 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010206 master-0 kubenswrapper[4172]: I0307 21:13:27.009969 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010206 master-0 kubenswrapper[4172]: I0307 21:13:27.010064 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hx8ck\" (UniqueName: \"kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010206 master-0 kubenswrapper[4172]: I0307 21:13:27.010154 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010361 master-0 kubenswrapper[4172]: I0307 21:13:27.010221 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010361 master-0 kubenswrapper[4172]: I0307 21:13:27.010246 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010361 master-0 kubenswrapper[4172]: I0307 21:13:27.010267 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010361 master-0 kubenswrapper[4172]: I0307 21:13:27.010294 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010361 master-0 kubenswrapper[4172]: I0307 21:13:27.010344 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010561 master-0 kubenswrapper[4172]: I0307 21:13:27.010396 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010561 master-0 kubenswrapper[4172]: I0307 21:13:27.010435 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010561 master-0 kubenswrapper[4172]: I0307 21:13:27.010475 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010708 master-0 kubenswrapper[4172]: I0307 21:13:27.010562 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010708 master-0 kubenswrapper[4172]: I0307 21:13:27.010602 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.010708 master-0 kubenswrapper[4172]: I0307 21:13:27.010641 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.112261 master-0 kubenswrapper[4172]: I0307 21:13:27.112169 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.112261 master-0 kubenswrapper[4172]: I0307 21:13:27.112247 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.112753 master-0 kubenswrapper[4172]: I0307 21:13:27.112388 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.112753 master-0 kubenswrapper[4172]: I0307 21:13:27.112564 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.112753 master-0 kubenswrapper[4172]: I0307 21:13:27.112655 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.112753 master-0 kubenswrapper[4172]: I0307 21:13:27.112725 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.112767 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.112810 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.112883 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.112889 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.112895 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.112950 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.113001 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.113038 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113125 master-0 kubenswrapper[4172]: I0307 21:13:27.113137 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113163 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113198 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113271 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113288 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113324 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113463 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113510 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113549 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113674 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113740 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113819 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.113870 master-0 kubenswrapper[4172]: I0307 21:13:27.113833 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.115039 master-0 kubenswrapper[4172]: I0307 21:13:27.113985 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8ck\" (UniqueName: \"kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.115039 master-0 kubenswrapper[4172]: I0307 21:13:27.114063 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.115039 master-0 kubenswrapper[4172]: I0307 21:13:27.114142 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.115039 master-0 kubenswrapper[4172]: I0307 21:13:27.114321 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.115468 master-0 kubenswrapper[4172]: I0307 21:13:27.115399 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:13:27.115611 master-0 kubenswrapper[4172]: I0307 21:13:27.115560 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName:
\"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:13:27.128011 master-0 kubenswrapper[4172]: I0307 21:13:27.127946 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-xf7kg"] Mar 07 21:13:27.128777 master-0 kubenswrapper[4172]: I0307 21:13:27.128732 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.131637 master-0 kubenswrapper[4172]: I0307 21:13:27.131548 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 21:13:27.131844 master-0 kubenswrapper[4172]: I0307 21:13:27.131564 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 07 21:13:27.160194 master-0 kubenswrapper[4172]: I0307 21:13:27.159973 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8ck\" (UniqueName: \"kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:13:27.215205 master-0 kubenswrapper[4172]: I0307 21:13:27.214851 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.215205 master-0 kubenswrapper[4172]: I0307 21:13:27.214961 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2j2bf\" 
(UniqueName: \"kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.215205 master-0 kubenswrapper[4172]: I0307 21:13:27.215085 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.215774 master-0 kubenswrapper[4172]: I0307 21:13:27.215206 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.215774 master-0 kubenswrapper[4172]: I0307 21:13:27.215482 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.215774 master-0 kubenswrapper[4172]: I0307 21:13:27.215626 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " 
pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.215774 master-0 kubenswrapper[4172]: I0307 21:13:27.215730 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.215942 master-0 kubenswrapper[4172]: I0307 21:13:27.215784 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.252292 master-0 kubenswrapper[4172]: I0307 21:13:27.252223 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-g6nmq" Mar 07 21:13:27.273451 master-0 kubenswrapper[4172]: W0307 21:13:27.273351 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb269ae2f_44ff_46c7_9039_21fca4a7a790.slice/crio-6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da WatchSource:0}: Error finding container 6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da: Status 404 returned error can't find the container with id 6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da Mar 07 21:13:27.316910 master-0 kubenswrapper[4172]: I0307 21:13:27.316834 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317004 master-0 kubenswrapper[4172]: I0307 21:13:27.316922 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317004 master-0 kubenswrapper[4172]: I0307 21:13:27.316976 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317422 master-0 kubenswrapper[4172]: I0307 21:13:27.317374 4172 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317422 master-0 kubenswrapper[4172]: I0307 21:13:27.317419 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317529 master-0 kubenswrapper[4172]: I0307 21:13:27.317445 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317592 master-0 kubenswrapper[4172]: I0307 21:13:27.317520 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317663 master-0 kubenswrapper[4172]: I0307 21:13:27.317585 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 
21:13:27.317742 master-0 kubenswrapper[4172]: I0307 21:13:27.317655 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317742 master-0 kubenswrapper[4172]: I0307 21:13:27.317696 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317742 master-0 kubenswrapper[4172]: I0307 21:13:27.317722 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2bf\" (UniqueName: \"kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.317855 master-0 kubenswrapper[4172]: I0307 21:13:27.317744 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.318933 master-0 kubenswrapper[4172]: I0307 21:13:27.318892 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: 
\"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.318933 master-0 kubenswrapper[4172]: I0307 21:13:27.318910 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.319114 master-0 kubenswrapper[4172]: I0307 21:13:27.318953 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.350111 master-0 kubenswrapper[4172]: I0307 21:13:27.349921 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j2bf\" (UniqueName: \"kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.453078 master-0 kubenswrapper[4172]: I0307 21:13:27.452972 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:13:27.470383 master-0 kubenswrapper[4172]: W0307 21:13:27.470245 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3caff2c1_f178_4e16_916d_27ccf178ff37.slice/crio-9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc WatchSource:0}: Error finding container 9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc: Status 404 returned error can't find the container with id 9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc Mar 07 21:13:27.594870 master-0 kubenswrapper[4172]: I0307 21:13:27.594756 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerStarted","Data":"9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc"} Mar 07 21:13:27.596326 master-0 kubenswrapper[4172]: I0307 21:13:27.596261 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6nmq" event={"ID":"b269ae2f-44ff-46c7-9039-21fca4a7a790","Type":"ContainerStarted","Data":"6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da"} Mar 07 21:13:27.920821 master-0 kubenswrapper[4172]: I0307 21:13:27.920263 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-l2bdp"] Mar 07 21:13:27.921844 master-0 kubenswrapper[4172]: I0307 21:13:27.921809 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:27.922132 master-0 kubenswrapper[4172]: E0307 21:13:27.922089 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:28.023737 master-0 kubenswrapper[4172]: I0307 21:13:28.023595 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgwj6\" (UniqueName: \"kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:28.023737 master-0 kubenswrapper[4172]: I0307 21:13:28.023652 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:28.125011 master-0 kubenswrapper[4172]: I0307 21:13:28.124894 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:28.125297 master-0 kubenswrapper[4172]: I0307 21:13:28.125031 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgwj6\" 
(UniqueName: \"kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:28.125425 master-0 kubenswrapper[4172]: E0307 21:13:28.125349 4172 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:28.125560 master-0 kubenswrapper[4172]: E0307 21:13:28.125524 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:13:28.625475212 +0000 UTC m=+59.297893149 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:28.147516 master-0 kubenswrapper[4172]: I0307 21:13:28.147440 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgwj6\" (UniqueName: \"kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:28.629460 master-0 kubenswrapper[4172]: I0307 21:13:28.629367 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:28.629797 master-0 
kubenswrapper[4172]: E0307 21:13:28.629527 4172 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:28.629797 master-0 kubenswrapper[4172]: E0307 21:13:28.629590 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:13:29.629572184 +0000 UTC m=+60.301990081 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:29.279949 master-0 kubenswrapper[4172]: I0307 21:13:29.279882 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:29.280985 master-0 kubenswrapper[4172]: E0307 21:13:29.280061 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:29.638374 master-0 kubenswrapper[4172]: I0307 21:13:29.638283 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:29.638831 master-0 kubenswrapper[4172]: E0307 21:13:29.638517 4172 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:29.638831 master-0 kubenswrapper[4172]: E0307 21:13:29.638666 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:13:31.638636601 +0000 UTC m=+62.311054508 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:31.280199 master-0 kubenswrapper[4172]: I0307 21:13:31.280120 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:31.280935 master-0 kubenswrapper[4172]: E0307 21:13:31.280451 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:31.611193 master-0 kubenswrapper[4172]: I0307 21:13:31.611052 4172 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="7321d4d6e798cb535bde4f9b51f6814dd5e6706005dac86d4315f2c88fc7fa27" exitCode=0 Mar 07 21:13:31.611482 master-0 kubenswrapper[4172]: I0307 21:13:31.611200 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerDied","Data":"7321d4d6e798cb535bde4f9b51f6814dd5e6706005dac86d4315f2c88fc7fa27"} Mar 07 21:13:31.656460 master-0 kubenswrapper[4172]: I0307 21:13:31.656277 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:31.656920 master-0 kubenswrapper[4172]: E0307 21:13:31.656537 4172 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:31.656920 master-0 kubenswrapper[4172]: E0307 21:13:31.656670 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:13:35.656631022 +0000 UTC m=+66.329048929 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:33.281534 master-0 kubenswrapper[4172]: I0307 21:13:33.281459 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:33.282257 master-0 kubenswrapper[4172]: E0307 21:13:33.281643 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:35.280378 master-0 kubenswrapper[4172]: I0307 21:13:35.280158 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:35.281556 master-0 kubenswrapper[4172]: E0307 21:13:35.280424 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:35.690295 master-0 kubenswrapper[4172]: I0307 21:13:35.690193 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:35.690540 master-0 kubenswrapper[4172]: E0307 21:13:35.690470 4172 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:35.690631 master-0 kubenswrapper[4172]: E0307 21:13:35.690599 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:13:43.69056174 +0000 UTC m=+74.362979667 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:37.279937 master-0 kubenswrapper[4172]: I0307 21:13:37.279858 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:37.281140 master-0 kubenswrapper[4172]: E0307 21:13:37.280039 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:38.632849 master-0 kubenswrapper[4172]: I0307 21:13:38.632732 4172 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="1204b0d0bd4ef37ca4508ca7c0bfef9f1e850dc26e2ddde2b7523df8be7455e3" exitCode=0 Mar 07 21:13:38.632849 master-0 kubenswrapper[4172]: I0307 21:13:38.632828 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerDied","Data":"1204b0d0bd4ef37ca4508ca7c0bfef9f1e850dc26e2ddde2b7523df8be7455e3"} Mar 07 21:13:39.279796 master-0 kubenswrapper[4172]: I0307 21:13:39.279644 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:39.280495 master-0 kubenswrapper[4172]: E0307 21:13:39.279945 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:39.320513 master-0 kubenswrapper[4172]: I0307 21:13:39.320441 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k"] Mar 07 21:13:39.320975 master-0 kubenswrapper[4172]: I0307 21:13:39.320940 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.323906 master-0 kubenswrapper[4172]: I0307 21:13:39.323844 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 21:13:39.324481 master-0 kubenswrapper[4172]: I0307 21:13:39.324406 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 21:13:39.324749 master-0 kubenswrapper[4172]: I0307 21:13:39.324707 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 07 21:13:39.325503 master-0 kubenswrapper[4172]: I0307 21:13:39.325427 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 21:13:39.325917 master-0 kubenswrapper[4172]: I0307 21:13:39.325869 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 21:13:39.425170 master-0 kubenswrapper[4172]: I0307 21:13:39.425059 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.425170 master-0 kubenswrapper[4172]: I0307 21:13:39.425154 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" 
Mar 07 21:13:39.425574 master-0 kubenswrapper[4172]: I0307 21:13:39.425218 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4h4st\" (UniqueName: \"kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.425574 master-0 kubenswrapper[4172]: I0307 21:13:39.425257 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.526831 master-0 kubenswrapper[4172]: I0307 21:13:39.526738 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.526831 master-0 kubenswrapper[4172]: I0307 21:13:39.526820 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4st\" (UniqueName: \"kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.526831 master-0 kubenswrapper[4172]: I0307 21:13:39.526848 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.527344 master-0 kubenswrapper[4172]: I0307 21:13:39.527258 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.528204 master-0 kubenswrapper[4172]: I0307 21:13:39.528170 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.528304 master-0 kubenswrapper[4172]: I0307 21:13:39.528226 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.538283 master-0 kubenswrapper[4172]: I0307 21:13:39.538160 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.541835 master-0 kubenswrapper[4172]: I0307 21:13:39.541754 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rqhcv"] Mar 07 21:13:39.542657 master-0 kubenswrapper[4172]: I0307 21:13:39.542476 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.545485 master-0 kubenswrapper[4172]: I0307 21:13:39.544713 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 21:13:39.545485 master-0 kubenswrapper[4172]: I0307 21:13:39.544844 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 21:13:39.551121 master-0 kubenswrapper[4172]: I0307 21:13:39.550260 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4st\" (UniqueName: \"kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.627932 master-0 kubenswrapper[4172]: I0307 21:13:39.627877 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-ovn-kubernetes\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.627932 master-0 kubenswrapper[4172]: I0307 21:13:39.627923 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-slash\") pod 
\"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.627932 master-0 kubenswrapper[4172]: I0307 21:13:39.627948 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-netns\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628252 master-0 kubenswrapper[4172]: I0307 21:13:39.627971 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-netd\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628252 master-0 kubenswrapper[4172]: I0307 21:13:39.628114 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-systemd-units\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628252 master-0 kubenswrapper[4172]: I0307 21:13:39.628188 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-bin\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628252 master-0 kubenswrapper[4172]: I0307 21:13:39.628215 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-env-overrides\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628252 master-0 kubenswrapper[4172]: I0307 21:13:39.628246 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-script-lib\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628387 master-0 kubenswrapper[4172]: I0307 21:13:39.628294 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-ovn\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628387 master-0 kubenswrapper[4172]: I0307 21:13:39.628339 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-node-log\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628387 master-0 kubenswrapper[4172]: I0307 21:13:39.628368 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-kubelet\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628472 master-0 kubenswrapper[4172]: I0307 21:13:39.628395 4172 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628544 master-0 kubenswrapper[4172]: I0307 21:13:39.628468 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovn-node-metrics-cert\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628585 master-0 kubenswrapper[4172]: I0307 21:13:39.628555 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-var-lib-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628618 master-0 kubenswrapper[4172]: I0307 21:13:39.628584 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grms2\" (UniqueName: \"kubernetes.io/projected/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-kube-api-access-grms2\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628649 master-0 kubenswrapper[4172]: I0307 21:13:39.628627 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-log-socket\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628705 master-0 kubenswrapper[4172]: I0307 21:13:39.628659 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-config\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628756 master-0 kubenswrapper[4172]: I0307 21:13:39.628741 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-systemd\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628794 master-0 kubenswrapper[4172]: I0307 21:13:39.628766 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-etc-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.628794 master-0 kubenswrapper[4172]: I0307 21:13:39.628784 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.645990 master-0 kubenswrapper[4172]: I0307 21:13:39.645932 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:13:39.729673 master-0 kubenswrapper[4172]: I0307 21:13:39.729616 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-kubelet\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.729673 master-0 kubenswrapper[4172]: I0307 21:13:39.729670 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.729994 master-0 kubenswrapper[4172]: I0307 21:13:39.729786 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-kubelet\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.729994 master-0 kubenswrapper[4172]: I0307 21:13:39.729868 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovn-node-metrics-cert\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.729994 master-0 kubenswrapper[4172]: I0307 21:13:39.729910 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-var-lib-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730112 master-0 kubenswrapper[4172]: I0307 21:13:39.730033 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730155 master-0 kubenswrapper[4172]: I0307 21:13:39.730108 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grms2\" (UniqueName: \"kubernetes.io/projected/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-kube-api-access-grms2\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730196 master-0 kubenswrapper[4172]: I0307 21:13:39.730163 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-log-socket\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730196 master-0 kubenswrapper[4172]: I0307 21:13:39.730194 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-config\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730274 master-0 kubenswrapper[4172]: I0307 21:13:39.730129 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-var-lib-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730274 master-0 kubenswrapper[4172]: I0307 21:13:39.730220 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-etc-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730274 master-0 kubenswrapper[4172]: I0307 21:13:39.730245 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-systemd\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730390 master-0 kubenswrapper[4172]: I0307 21:13:39.730380 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730478 master-0 kubenswrapper[4172]: I0307 21:13:39.730455 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730596 master-0 kubenswrapper[4172]: I0307 21:13:39.730569 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" 
(UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-etc-openvswitch\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730656 master-0 kubenswrapper[4172]: I0307 21:13:39.730635 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-ovn-kubernetes\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730748 master-0 kubenswrapper[4172]: I0307 21:13:39.730662 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-slash\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730748 master-0 kubenswrapper[4172]: I0307 21:13:39.730707 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-netns\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730748 master-0 kubenswrapper[4172]: I0307 21:13:39.730708 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-systemd\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730748 master-0 kubenswrapper[4172]: I0307 21:13:39.730724 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: 
\"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-netd\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730907 master-0 kubenswrapper[4172]: I0307 21:13:39.730755 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-slash\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730907 master-0 kubenswrapper[4172]: I0307 21:13:39.730775 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-netns\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730907 master-0 kubenswrapper[4172]: I0307 21:13:39.730848 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-env-overrides\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.730907 master-0 kubenswrapper[4172]: I0307 21:13:39.730889 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-systemd-units\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731059 master-0 kubenswrapper[4172]: I0307 21:13:39.730946 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-systemd-units\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731059 master-0 kubenswrapper[4172]: I0307 21:13:39.730941 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-ovn-kubernetes\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731059 master-0 kubenswrapper[4172]: I0307 21:13:39.730991 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-netd\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731059 master-0 kubenswrapper[4172]: I0307 21:13:39.730987 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-bin\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731059 master-0 kubenswrapper[4172]: I0307 21:13:39.730971 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-bin\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731259 master-0 kubenswrapper[4172]: I0307 21:13:39.731073 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-script-lib\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731259 master-0 kubenswrapper[4172]: I0307 21:13:39.731099 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-ovn\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731259 master-0 kubenswrapper[4172]: I0307 21:13:39.731147 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-node-log\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731259 master-0 kubenswrapper[4172]: I0307 21:13:39.731179 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-ovn\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731259 master-0 kubenswrapper[4172]: I0307 21:13:39.731246 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-config\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:13:39.731510 master-0 kubenswrapper[4172]: I0307 21:13:39.731248 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-node-log\") pod 
\"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:13:39.731560 master-0 kubenswrapper[4172]: I0307 21:13:39.731508 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-env-overrides\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:13:39.731797 master-0 kubenswrapper[4172]: I0307 21:13:39.731772 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-log-socket\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:13:39.731852 master-0 kubenswrapper[4172]: I0307 21:13:39.731825 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-script-lib\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:13:39.734205 master-0 kubenswrapper[4172]: I0307 21:13:39.734173 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovn-node-metrics-cert\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:13:39.747833 master-0 kubenswrapper[4172]: I0307 21:13:39.747749 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grms2\" (UniqueName: \"kubernetes.io/projected/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-kube-api-access-grms2\") pod \"ovnkube-node-rqhcv\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") " pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:13:39.868902 master-0 kubenswrapper[4172]: I0307 21:13:39.868841 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:13:40.034016 master-0 kubenswrapper[4172]: I0307 21:13:40.033945 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:13:40.034301 master-0 kubenswrapper[4172]: E0307 21:13:40.034110 4172 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:40.034301 master-0 kubenswrapper[4172]: E0307 21:13:40.034174 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:14:12.034155016 +0000 UTC m=+102.706572913 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found
Mar 07 21:13:41.280803 master-0 kubenswrapper[4172]: I0307 21:13:41.280670 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:41.281735 master-0 kubenswrapper[4172]: E0307 21:13:41.280922 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd"
Mar 07 21:13:42.221432 master-0 kubenswrapper[4172]: W0307 21:13:42.221140 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46548c2c_6a8a_4382_87de_2c7a8442a33c.slice/crio-e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0 WatchSource:0}: Error finding container e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0: Status 404 returned error can't find the container with id e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0
Mar 07 21:13:42.222644 master-0 kubenswrapper[4172]: W0307 21:13:42.222572 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8ea3e79_b0a0_4c22_a60f_c1d1d972fc0e.slice/crio-86843bb1afb42dfa39abd9f6396ee672969a397644d2eb40da43bc284d9135db WatchSource:0}: Error finding container 86843bb1afb42dfa39abd9f6396ee672969a397644d2eb40da43bc284d9135db: Status 404 returned error can't find the container with id 86843bb1afb42dfa39abd9f6396ee672969a397644d2eb40da43bc284d9135db
Mar 07 21:13:42.293356 master-0 kubenswrapper[4172]: W0307 21:13:42.293268 4172 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort (container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 07 21:13:42.294417 master-0 kubenswrapper[4172]: I0307 21:13:42.294354 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 07 21:13:42.510804 master-0 kubenswrapper[4172]: I0307 21:13:42.510494 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-fr4qr"]
Mar 07 21:13:42.511917 master-0 kubenswrapper[4172]: I0307 21:13:42.511867 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:42.512019 master-0 kubenswrapper[4172]: E0307 21:13:42.511977 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528"
Mar 07 21:13:42.534462 master-0 kubenswrapper[4172]: I0307 21:13:42.534344 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0-master-0" podStartSLOduration=0.534317938 podStartE2EDuration="534.317938ms" podCreationTimestamp="2026-03-07 21:13:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:13:42.533735434 +0000 UTC m=+73.206153371" watchObservedRunningTime="2026-03-07 21:13:42.534317938 +0000 UTC m=+73.206735835"
Mar 07 21:13:42.645811 master-0 kubenswrapper[4172]: I0307 21:13:42.645715 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"86843bb1afb42dfa39abd9f6396ee672969a397644d2eb40da43bc284d9135db"}
Mar 07 21:13:42.648271 master-0 kubenswrapper[4172]: I0307 21:13:42.648186 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-g6nmq" event={"ID":"b269ae2f-44ff-46c7-9039-21fca4a7a790","Type":"ContainerStarted","Data":"ba134e0cb39f6e5df7501c7b57b52e06d01f5dbababbe225d9b6ce1c6eae0dc4"}
Mar 07 21:13:42.651939 master-0 kubenswrapper[4172]: I0307 21:13:42.651868 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" event={"ID":"46548c2c-6a8a-4382-87de-2c7a8442a33c","Type":"ContainerStarted","Data":"0886274d7fdf47794abf678f1473a3dd108475baa2dd99f1a7c952e9f9cf9001"}
Mar 07 21:13:42.651939 master-0 kubenswrapper[4172]: I0307 21:13:42.651922 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" event={"ID":"46548c2c-6a8a-4382-87de-2c7a8442a33c","Type":"ContainerStarted","Data":"e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0"}
Mar 07 21:13:42.659288 master-0 kubenswrapper[4172]: I0307 21:13:42.659206 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:42.674733 master-0 kubenswrapper[4172]: I0307 21:13:42.673559 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-g6nmq" podStartSLOduration=1.587454817 podStartE2EDuration="16.673518707s" podCreationTimestamp="2026-03-07 21:13:26 +0000 UTC" firstStartedPulling="2026-03-07 21:13:27.277388185 +0000 UTC m=+57.949806112" lastFinishedPulling="2026-03-07 21:13:42.363452085 +0000 UTC m=+73.035870002" observedRunningTime="2026-03-07 21:13:42.673481796 +0000 UTC m=+73.345899773" watchObservedRunningTime="2026-03-07 21:13:42.673518707 +0000 UTC m=+73.345936634"
Mar 07 21:13:42.761392 master-0 kubenswrapper[4172]: I0307 21:13:42.760430 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:42.787613 master-0 kubenswrapper[4172]: E0307 21:13:42.787389 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 07 21:13:42.787613 master-0 kubenswrapper[4172]: E0307 21:13:42.787474 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 07 21:13:42.787613 master-0 kubenswrapper[4172]: E0307 21:13:42.787497 4172 projected.go:194] Error preparing data for projected volume kube-api-access-qwzgb for pod openshift-network-diagnostics/network-check-target-fr4qr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:42.787613 master-0 kubenswrapper[4172]: E0307 21:13:42.787606 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb podName:15270349-f3aa-43bc-88a8-f0fff3aa2528 nodeName:}" failed. No retries permitted until 2026-03-07 21:13:43.287570429 +0000 UTC m=+73.959988366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-qwzgb" (UniqueName: "kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb") pod "network-check-target-fr4qr" (UID: "15270349-f3aa-43bc-88a8-f0fff3aa2528") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:43.280462 master-0 kubenswrapper[4172]: I0307 21:13:43.280413 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:43.280595 master-0 kubenswrapper[4172]: E0307 21:13:43.280545 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd"
Mar 07 21:13:43.366661 master-0 kubenswrapper[4172]: I0307 21:13:43.366591 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:43.367065 master-0 kubenswrapper[4172]: E0307 21:13:43.366823 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 07 21:13:43.367065 master-0 kubenswrapper[4172]: E0307 21:13:43.366857 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 07 21:13:43.367065 master-0 kubenswrapper[4172]: E0307 21:13:43.366869 4172 projected.go:194] Error preparing data for projected volume kube-api-access-qwzgb for pod openshift-network-diagnostics/network-check-target-fr4qr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:43.367065 master-0 kubenswrapper[4172]: E0307 21:13:43.366931 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb podName:15270349-f3aa-43bc-88a8-f0fff3aa2528 nodeName:}" failed. No retries permitted until 2026-03-07 21:13:44.366912799 +0000 UTC m=+75.039330696 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-qwzgb" (UniqueName: "kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb") pod "network-check-target-fr4qr" (UID: "15270349-f3aa-43bc-88a8-f0fff3aa2528") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:43.657105 master-0 kubenswrapper[4172]: I0307 21:13:43.657030 4172 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="a27fb1b48c1a71a257bd1f0e26afd03b783b613cf862675ed38b35ffc09792a8" exitCode=0
Mar 07 21:13:43.657105 master-0 kubenswrapper[4172]: I0307 21:13:43.657062 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerDied","Data":"a27fb1b48c1a71a257bd1f0e26afd03b783b613cf862675ed38b35ffc09792a8"}
Mar 07 21:13:43.772268 master-0 kubenswrapper[4172]: I0307 21:13:43.771963 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:43.772268 master-0 kubenswrapper[4172]: E0307 21:13:43.772192 4172 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 07 21:13:43.772587 master-0 kubenswrapper[4172]: E0307 21:13:43.772322 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:13:59.772287855 +0000 UTC m=+90.444705772 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : object "openshift-multus"/"metrics-daemon-secret" not registered
Mar 07 21:13:44.279999 master-0 kubenswrapper[4172]: I0307 21:13:44.279937 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:44.280253 master-0 kubenswrapper[4172]: E0307 21:13:44.280087 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528"
Mar 07 21:13:44.377648 master-0 kubenswrapper[4172]: I0307 21:13:44.377594 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:44.378400 master-0 kubenswrapper[4172]: E0307 21:13:44.377787 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 07 21:13:44.378400 master-0 kubenswrapper[4172]: E0307 21:13:44.377805 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 07 21:13:44.378400 master-0 kubenswrapper[4172]: E0307 21:13:44.377815 4172 projected.go:194] Error preparing data for projected volume kube-api-access-qwzgb for pod openshift-network-diagnostics/network-check-target-fr4qr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:44.378400 master-0 kubenswrapper[4172]: E0307 21:13:44.377862 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb podName:15270349-f3aa-43bc-88a8-f0fff3aa2528 nodeName:}" failed. No retries permitted until 2026-03-07 21:13:46.377847288 +0000 UTC m=+77.050265185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-qwzgb" (UniqueName: "kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb") pod "network-check-target-fr4qr" (UID: "15270349-f3aa-43bc-88a8-f0fff3aa2528") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:45.138994 master-0 kubenswrapper[4172]: I0307 21:13:45.138960 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-node-identity/network-node-identity-kpsm4"]
Mar 07 21:13:45.139427 master-0 kubenswrapper[4172]: I0307 21:13:45.139367 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.144476 master-0 kubenswrapper[4172]: I0307 21:13:45.144451 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 07 21:13:45.144657 master-0 kubenswrapper[4172]: I0307 21:13:45.144580 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 07 21:13:45.145741 master-0 kubenswrapper[4172]: I0307 21:13:45.144632 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 07 21:13:45.145958 master-0 kubenswrapper[4172]: I0307 21:13:45.144677 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 07 21:13:45.145958 master-0 kubenswrapper[4172]: I0307 21:13:45.144781 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 07 21:13:45.280359 master-0 kubenswrapper[4172]: I0307 21:13:45.280221 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:45.280580 master-0 kubenswrapper[4172]: E0307 21:13:45.280401 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd"
Mar 07 21:13:45.285734 master-0 kubenswrapper[4172]: I0307 21:13:45.285695 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.285813 master-0 kubenswrapper[4172]: I0307 21:13:45.285764 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.285813 master-0 kubenswrapper[4172]: I0307 21:13:45.285792 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f72ps\" (UniqueName: \"kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.285883 master-0 kubenswrapper[4172]: I0307 21:13:45.285820 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.386654 master-0 kubenswrapper[4172]: I0307 21:13:45.386590 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.386654 master-0 kubenswrapper[4172]: I0307 21:13:45.386642 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72ps\" (UniqueName: \"kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.388143 master-0 kubenswrapper[4172]: I0307 21:13:45.386931 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.388143 master-0 kubenswrapper[4172]: I0307 21:13:45.387014 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.388291 master-0 kubenswrapper[4172]: I0307 21:13:45.388160 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.388843 master-0 kubenswrapper[4172]: I0307 21:13:45.388771 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.394516 master-0 kubenswrapper[4172]: I0307 21:13:45.394433 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.405647 master-0 kubenswrapper[4172]: I0307 21:13:45.405577 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72ps\" (UniqueName: \"kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.456270 master-0 kubenswrapper[4172]: I0307 21:13:45.456142 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:13:45.472052 master-0 kubenswrapper[4172]: W0307 21:13:45.471967 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod27b149f7_6aff_45f3_b935_e65279f2f9ee.slice/crio-a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767 WatchSource:0}: Error finding container a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767: Status 404 returned error can't find the container with id a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767
Mar 07 21:13:45.664896 master-0 kubenswrapper[4172]: I0307 21:13:45.664809 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kpsm4" event={"ID":"27b149f7-6aff-45f3-b935-e65279f2f9ee","Type":"ContainerStarted","Data":"a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767"}
Mar 07 21:13:45.668655 master-0 kubenswrapper[4172]: I0307 21:13:45.668599 4172 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="22f599619e79420fd9506a4f183f60ba821b3ac500c2322da39e388d594122e4" exitCode=0
Mar 07 21:13:45.668655 master-0 kubenswrapper[4172]: I0307 21:13:45.668644 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerDied","Data":"22f599619e79420fd9506a4f183f60ba821b3ac500c2322da39e388d594122e4"}
Mar 07 21:13:46.280089 master-0 kubenswrapper[4172]: I0307 21:13:46.280003 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:46.280328 master-0 kubenswrapper[4172]: E0307 21:13:46.280165 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528"
Mar 07 21:13:46.395663 master-0 kubenswrapper[4172]: I0307 21:13:46.395587 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:46.396808 master-0 kubenswrapper[4172]: E0307 21:13:46.395968 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 07 21:13:46.396808 master-0 kubenswrapper[4172]: E0307 21:13:46.396012 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 07 21:13:46.396808 master-0 kubenswrapper[4172]: E0307 21:13:46.396031 4172 projected.go:194] Error preparing data for projected volume kube-api-access-qwzgb for pod openshift-network-diagnostics/network-check-target-fr4qr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:46.396808 master-0 kubenswrapper[4172]: E0307 21:13:46.396125 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb podName:15270349-f3aa-43bc-88a8-f0fff3aa2528 nodeName:}" failed. No retries permitted until 2026-03-07 21:13:50.396098781 +0000 UTC m=+81.068516678 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-qwzgb" (UniqueName: "kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb") pod "network-check-target-fr4qr" (UID: "15270349-f3aa-43bc-88a8-f0fff3aa2528") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:47.280339 master-0 kubenswrapper[4172]: I0307 21:13:47.280233 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:47.280663 master-0 kubenswrapper[4172]: E0307 21:13:47.280433 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd"
Mar 07 21:13:48.279790 master-0 kubenswrapper[4172]: I0307 21:13:48.279720 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:48.280469 master-0 kubenswrapper[4172]: E0307 21:13:48.279952 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528"
Mar 07 21:13:49.279653 master-0 kubenswrapper[4172]: I0307 21:13:49.279602 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:49.279917 master-0 kubenswrapper[4172]: E0307 21:13:49.279821 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd"
Mar 07 21:13:50.279607 master-0 kubenswrapper[4172]: I0307 21:13:50.279545 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:50.280648 master-0 kubenswrapper[4172]: E0307 21:13:50.280250 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528"
Mar 07 21:13:50.436042 master-0 kubenswrapper[4172]: I0307 21:13:50.435975 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:50.436344 master-0 kubenswrapper[4172]: E0307 21:13:50.436303 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 07 21:13:50.436344 master-0 kubenswrapper[4172]: E0307 21:13:50.436338 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 07 21:13:50.436459 master-0 kubenswrapper[4172]: E0307 21:13:50.436357 4172 projected.go:194] Error preparing data for projected volume kube-api-access-qwzgb for pod openshift-network-diagnostics/network-check-target-fr4qr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:50.436459 master-0 kubenswrapper[4172]: E0307 21:13:50.436436 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb podName:15270349-f3aa-43bc-88a8-f0fff3aa2528 nodeName:}" failed. No retries permitted until 2026-03-07 21:13:58.436412926 +0000 UTC m=+89.108830833 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-qwzgb" (UniqueName: "kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb") pod "network-check-target-fr4qr" (UID: "15270349-f3aa-43bc-88a8-f0fff3aa2528") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 07 21:13:51.279979 master-0 kubenswrapper[4172]: I0307 21:13:51.279911 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:51.280246 master-0 kubenswrapper[4172]: E0307 21:13:51.280085 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd"
Mar 07 21:13:52.280124 master-0 kubenswrapper[4172]: I0307 21:13:52.279993 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:52.280643 master-0 kubenswrapper[4172]: E0307 21:13:52.280165 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528"
Mar 07 21:13:53.279761 master-0 kubenswrapper[4172]: I0307 21:13:53.279660 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:53.280522 master-0 kubenswrapper[4172]: E0307 21:13:53.279881 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd"
Mar 07 21:13:54.279724 master-0 kubenswrapper[4172]: I0307 21:13:54.279636 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:13:54.280065 master-0 kubenswrapper[4172]: E0307 21:13:54.279799 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528"
Mar 07 21:13:55.279944 master-0 kubenswrapper[4172]: I0307 21:13:55.279841 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:13:55.281093 master-0 kubenswrapper[4172]: E0307 21:13:55.280088 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:56.280408 master-0 kubenswrapper[4172]: I0307 21:13:56.280311 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:13:56.281063 master-0 kubenswrapper[4172]: E0307 21:13:56.280445 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:13:56.293937 master-0 kubenswrapper[4172]: I0307 21:13:56.293871 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 07 21:13:57.280126 master-0 kubenswrapper[4172]: I0307 21:13:57.280049 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:57.280445 master-0 kubenswrapper[4172]: E0307 21:13:57.280220 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:58.280902 master-0 kubenswrapper[4172]: I0307 21:13:58.280100 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:13:58.281580 master-0 kubenswrapper[4172]: E0307 21:13:58.281002 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:13:58.515103 master-0 kubenswrapper[4172]: I0307 21:13:58.515042 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:13:58.515385 master-0 kubenswrapper[4172]: E0307 21:13:58.515333 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 21:13:58.515442 master-0 kubenswrapper[4172]: E0307 21:13:58.515392 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 21:13:58.515442 master-0 kubenswrapper[4172]: E0307 21:13:58.515413 4172 projected.go:194] Error preparing data for projected volume kube-api-access-qwzgb for pod openshift-network-diagnostics/network-check-target-fr4qr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 21:13:58.515542 master-0 kubenswrapper[4172]: E0307 21:13:58.515514 4172 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb podName:15270349-f3aa-43bc-88a8-f0fff3aa2528 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:14.515482907 +0000 UTC m=+105.187900834 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-qwzgb" (UniqueName: "kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb") pod "network-check-target-fr4qr" (UID: "15270349-f3aa-43bc-88a8-f0fff3aa2528") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 21:13:58.710617 master-0 kubenswrapper[4172]: I0307 21:13:58.710512 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" event={"ID":"46548c2c-6a8a-4382-87de-2c7a8442a33c","Type":"ContainerStarted","Data":"fd701e4ed1aac8c9685fae0f60e9ef450afe5e5e84030884d9be44f37a388515"} Mar 07 21:13:58.715024 master-0 kubenswrapper[4172]: I0307 21:13:58.714952 4172 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="9574294e43be3e31b38cb24910c3f9a3961ac6d7fef3d8e88cef73fac06c22e3" exitCode=0 Mar 07 21:13:58.715198 master-0 kubenswrapper[4172]: I0307 21:13:58.715025 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerDied","Data":"9574294e43be3e31b38cb24910c3f9a3961ac6d7fef3d8e88cef73fac06c22e3"} Mar 07 21:13:58.717121 master-0 kubenswrapper[4172]: I0307 21:13:58.717075 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950" exitCode=0 Mar 07 21:13:58.717121 master-0 kubenswrapper[4172]: I0307 21:13:58.717120 4172 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"} Mar 07 21:13:58.719253 master-0 kubenswrapper[4172]: I0307 21:13:58.719183 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kpsm4" event={"ID":"27b149f7-6aff-45f3-b935-e65279f2f9ee","Type":"ContainerStarted","Data":"98d5387debce255a652d1b794239fb6ace25d54dad34766bdbf701b015ffe247"} Mar 07 21:13:58.719253 master-0 kubenswrapper[4172]: I0307 21:13:58.719232 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kpsm4" event={"ID":"27b149f7-6aff-45f3-b935-e65279f2f9ee","Type":"ContainerStarted","Data":"b4c206c3c4663a323dfd8e9366ecb2a5c29f3b592252091d3dcb0f1ec3e2e0b8"} Mar 07 21:13:58.736163 master-0 kubenswrapper[4172]: I0307 21:13:58.735532 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" podStartSLOduration=4.093639727 podStartE2EDuration="19.735504622s" podCreationTimestamp="2026-03-07 21:13:39 +0000 UTC" firstStartedPulling="2026-03-07 21:13:42.49329735 +0000 UTC m=+73.165715257" lastFinishedPulling="2026-03-07 21:13:58.135162235 +0000 UTC m=+88.807580152" observedRunningTime="2026-03-07 21:13:58.735030169 +0000 UTC m=+89.407448096" watchObservedRunningTime="2026-03-07 21:13:58.735504622 +0000 UTC m=+89.407922559" Mar 07 21:13:58.789345 master-0 kubenswrapper[4172]: I0307 21:13:58.789236 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-scheduler-master-0" podStartSLOduration=2.789200631 podStartE2EDuration="2.789200631s" podCreationTimestamp="2026-03-07 21:13:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:13:58.75650675 +0000 UTC m=+89.428924707" watchObservedRunningTime="2026-03-07 21:13:58.789200631 +0000 UTC m=+89.461618528" Mar 07 21:13:58.824266 master-0 kubenswrapper[4172]: I0307 21:13:58.824139 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-node-identity/network-node-identity-kpsm4" podStartSLOduration=1.124158103 podStartE2EDuration="13.824116571s" podCreationTimestamp="2026-03-07 21:13:45 +0000 UTC" firstStartedPulling="2026-03-07 21:13:45.475379924 +0000 UTC m=+76.147797831" lastFinishedPulling="2026-03-07 21:13:58.175338392 +0000 UTC m=+88.847756299" observedRunningTime="2026-03-07 21:13:58.822851628 +0000 UTC m=+89.495269545" watchObservedRunningTime="2026-03-07 21:13:58.824116571 +0000 UTC m=+89.496534478" Mar 07 21:13:59.280371 master-0 kubenswrapper[4172]: I0307 21:13:59.279734 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:59.280552 master-0 kubenswrapper[4172]: E0307 21:13:59.280454 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:13:59.730401 master-0 kubenswrapper[4172]: I0307 21:13:59.730284 4172 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="8fa7422d23bcb03f45ab2c3bec3ea5e6214caa8f28b047daca9c932d4eca1830" exitCode=0 Mar 07 21:13:59.731903 master-0 kubenswrapper[4172]: I0307 21:13:59.730440 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerDied","Data":"8fa7422d23bcb03f45ab2c3bec3ea5e6214caa8f28b047daca9c932d4eca1830"} Mar 07 21:13:59.737314 master-0 kubenswrapper[4172]: I0307 21:13:59.737213 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"} Mar 07 21:13:59.737448 master-0 kubenswrapper[4172]: I0307 21:13:59.737321 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"} Mar 07 21:13:59.737448 master-0 kubenswrapper[4172]: I0307 21:13:59.737347 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"} Mar 07 21:13:59.737448 master-0 kubenswrapper[4172]: I0307 21:13:59.737366 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" 
event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"} Mar 07 21:13:59.737448 master-0 kubenswrapper[4172]: I0307 21:13:59.737387 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"} Mar 07 21:13:59.737448 master-0 kubenswrapper[4172]: I0307 21:13:59.737405 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"} Mar 07 21:13:59.829503 master-0 kubenswrapper[4172]: I0307 21:13:59.829360 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:13:59.829780 master-0 kubenswrapper[4172]: E0307 21:13:59.829557 4172 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:13:59.829780 master-0 kubenswrapper[4172]: E0307 21:13:59.829634 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:31.829608468 +0000 UTC m=+122.502026375 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 07 21:14:00.280028 master-0 kubenswrapper[4172]: I0307 21:14:00.279917 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:00.283645 master-0 kubenswrapper[4172]: E0307 21:14:00.283489 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:00.750183 master-0 kubenswrapper[4172]: I0307 21:14:00.750047 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" event={"ID":"3caff2c1-f178-4e16-916d-27ccf178ff37","Type":"ContainerStarted","Data":"697161f03c6107412ca183d0f72436c188d454ea7e2dbe4d4b57b34e7b253272"} Mar 07 21:14:01.279853 master-0 kubenswrapper[4172]: I0307 21:14:01.279734 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:01.280214 master-0 kubenswrapper[4172]: E0307 21:14:01.279959 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:02.280101 master-0 kubenswrapper[4172]: I0307 21:14:02.279881 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:02.280101 master-0 kubenswrapper[4172]: E0307 21:14:02.280095 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:02.773721 master-0 kubenswrapper[4172]: I0307 21:14:02.773301 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"} Mar 07 21:14:03.280768 master-0 kubenswrapper[4172]: I0307 21:14:03.280613 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:03.281922 master-0 kubenswrapper[4172]: E0307 21:14:03.280953 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:04.281260 master-0 kubenswrapper[4172]: I0307 21:14:04.281126 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:04.282224 master-0 kubenswrapper[4172]: E0307 21:14:04.281338 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:04.787785 master-0 kubenswrapper[4172]: I0307 21:14:04.787567 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerStarted","Data":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"} Mar 07 21:14:04.788662 master-0 kubenswrapper[4172]: I0307 21:14:04.788185 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:14:04.788662 master-0 kubenswrapper[4172]: I0307 21:14:04.788227 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:14:04.788662 master-0 kubenswrapper[4172]: I0307 21:14:04.788257 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:14:04.822298 master-0 kubenswrapper[4172]: I0307 21:14:04.822211 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:14:04.827509 master-0 kubenswrapper[4172]: I0307 21:14:04.827450 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" Mar 07 21:14:04.828401 master-0 kubenswrapper[4172]: I0307 21:14:04.828283 4172 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podStartSLOduration=9.923240967 podStartE2EDuration="25.828264001s" podCreationTimestamp="2026-03-07 21:13:39 +0000 UTC" firstStartedPulling="2026-03-07 21:13:42.226194888 +0000 UTC m=+72.898612785" lastFinishedPulling="2026-03-07 21:13:58.131217912 +0000 UTC m=+88.803635819" observedRunningTime="2026-03-07 21:14:04.827175282 +0000 UTC m=+95.499593269" watchObservedRunningTime="2026-03-07 21:14:04.828264001 +0000 UTC m=+95.500681948" Mar 07 21:14:04.829144 master-0 kubenswrapper[4172]: I0307 21:14:04.829054 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-xf7kg" podStartSLOduration=7.2631550879999995 podStartE2EDuration="37.829043592s" podCreationTimestamp="2026-03-07 21:13:27 +0000 UTC" firstStartedPulling="2026-03-07 21:13:27.47219961 +0000 UTC m=+58.144617517" lastFinishedPulling="2026-03-07 21:13:58.038088104 +0000 UTC m=+88.710506021" observedRunningTime="2026-03-07 21:14:00.779643729 +0000 UTC m=+91.452061706" watchObservedRunningTime="2026-03-07 21:14:04.829043592 +0000 UTC m=+95.501461519" Mar 07 21:14:05.002500 master-0 kubenswrapper[4172]: I0307 21:14:05.002408 4172 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rqhcv"] Mar 07 21:14:05.279741 master-0 kubenswrapper[4172]: I0307 21:14:05.279633 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:05.280092 master-0 kubenswrapper[4172]: E0307 21:14:05.279854 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:06.280904 master-0 kubenswrapper[4172]: I0307 21:14:06.280278 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:06.281554 master-0 kubenswrapper[4172]: E0307 21:14:06.281085 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:06.421156 master-0 kubenswrapper[4172]: I0307 21:14:06.421042 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fr4qr"] Mar 07 21:14:06.424271 master-0 kubenswrapper[4172]: I0307 21:14:06.424224 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l2bdp"] Mar 07 21:14:06.424487 master-0 kubenswrapper[4172]: I0307 21:14:06.424380 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:06.424565 master-0 kubenswrapper[4172]: E0307 21:14:06.424523 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:06.794507 master-0 kubenswrapper[4172]: I0307 21:14:06.794455 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:06.795095 master-0 kubenswrapper[4172]: E0307 21:14:06.795048 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:06.795952 master-0 kubenswrapper[4172]: I0307 21:14:06.795899 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="northd" containerID="cri-o://eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9" gracePeriod=30 Mar 07 21:14:06.796139 master-0 kubenswrapper[4172]: I0307 21:14:06.796052 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="nbdb" containerID="cri-o://a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896" gracePeriod=30 Mar 07 21:14:06.796276 master-0 kubenswrapper[4172]: I0307 21:14:06.796053 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="sbdb" containerID="cri-o://0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7" gracePeriod=30 Mar 07 21:14:06.796368 master-0 kubenswrapper[4172]: I0307 21:14:06.796310 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-acl-logging" 
containerID="cri-o://c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510" gracePeriod=30 Mar 07 21:14:06.796451 master-0 kubenswrapper[4172]: I0307 21:14:06.796247 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-node" containerID="cri-o://e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb" gracePeriod=30 Mar 07 21:14:06.796451 master-0 kubenswrapper[4172]: I0307 21:14:06.795869 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-controller" containerID="cri-o://9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9" gracePeriod=30 Mar 07 21:14:06.796598 master-0 kubenswrapper[4172]: I0307 21:14:06.795972 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277" gracePeriod=30 Mar 07 21:14:06.827766 master-0 kubenswrapper[4172]: I0307 21:14:06.827659 4172 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovnkube-controller" containerID="cri-o://5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff" gracePeriod=30 Mar 07 21:14:07.778085 master-0 kubenswrapper[4172]: I0307 21:14:07.778016 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/ovnkube-controller/0.log" Mar 07 21:14:07.781143 master-0 kubenswrapper[4172]: I0307 21:14:07.781083 4172 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/kube-rbac-proxy-ovn-metrics/0.log"
Mar 07 21:14:07.782034 master-0 kubenswrapper[4172]: I0307 21:14:07.781985 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/kube-rbac-proxy-node/0.log"
Mar 07 21:14:07.782952 master-0 kubenswrapper[4172]: I0307 21:14:07.782906 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/ovn-acl-logging/0.log"
Mar 07 21:14:07.783852 master-0 kubenswrapper[4172]: I0307 21:14:07.783807 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/ovn-controller/0.log"
Mar 07 21:14:07.784453 master-0 kubenswrapper[4172]: I0307 21:14:07.784406 4172 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:14:07.802755 master-0 kubenswrapper[4172]: I0307 21:14:07.802556 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/ovnkube-controller/0.log"
Mar 07 21:14:07.805546 master-0 kubenswrapper[4172]: I0307 21:14:07.805489 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/kube-rbac-proxy-ovn-metrics/0.log"
Mar 07 21:14:07.806549 master-0 kubenswrapper[4172]: I0307 21:14:07.806508 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/kube-rbac-proxy-node/0.log"
Mar 07 21:14:07.811373 master-0 kubenswrapper[4172]: I0307 21:14:07.811289 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/ovn-acl-logging/0.log"
Mar 07 21:14:07.812967 master-0 kubenswrapper[4172]: I0307 21:14:07.812831 4172 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-rqhcv_b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/ovn-controller/0.log"
Mar 07 21:14:07.814082 master-0 kubenswrapper[4172]: I0307 21:14:07.813934 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff" exitCode=2
Mar 07 21:14:07.814082 master-0 kubenswrapper[4172]: I0307 21:14:07.813982 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7" exitCode=0
Mar 07 21:14:07.814082 master-0 kubenswrapper[4172]: I0307 21:14:07.814005 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896" exitCode=0
Mar 07 21:14:07.814082 master-0 kubenswrapper[4172]: I0307 21:14:07.814028 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9" exitCode=0
Mar 07 21:14:07.814082 master-0 kubenswrapper[4172]: I0307 21:14:07.814045 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277" exitCode=143
Mar 07 21:14:07.814082 master-0 kubenswrapper[4172]: I0307 21:14:07.814071 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb" exitCode=143
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814108 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510" exitCode=143
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814124 4172 generic.go:334] "Generic (PLEG): container finished" podID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerID="9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9" exitCode=143
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814131 4172 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv"
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814157 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"}
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814220 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"}
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814244 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"}
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814266 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"}
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814291 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"}
Mar 07 21:14:07.814454 master-0 kubenswrapper[4172]: I0307 21:14:07.814310 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814369 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814525 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814537 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814553 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814571 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814585 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814596 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814608 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814619 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814630 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814642 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814653 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814669 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814732 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814769 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814785 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814802 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814820 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814835 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814851 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814867 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814881 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814896 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814917 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-rqhcv" event={"ID":"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e","Type":"ContainerDied","Data":"86843bb1afb42dfa39abd9f6396ee672969a397644d2eb40da43bc284d9135db"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814941 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814960 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814976 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"}
Mar 07 21:14:07.815046 master-0 kubenswrapper[4172]: I0307 21:14:07.814987 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"}
Mar 07 21:14:07.821738 master-0 kubenswrapper[4172]: I0307 21:14:07.814998 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"}
Mar 07 21:14:07.821738 master-0 kubenswrapper[4172]: I0307 21:14:07.815008 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"}
Mar 07 21:14:07.821738 master-0 kubenswrapper[4172]: I0307 21:14:07.815019 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"}
Mar 07 21:14:07.821738 master-0 kubenswrapper[4172]: I0307 21:14:07.815030 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"}
Mar 07 21:14:07.821738 master-0 kubenswrapper[4172]: I0307 21:14:07.815040 4172 pod_container_deletor.go:114] "Failed to issue the request to remove container" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"}
Mar 07 21:14:07.821738 master-0 kubenswrapper[4172]: I0307 21:14:07.814802 4172 scope.go:117] "RemoveContainer" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"
Mar 07 21:14:07.849613 master-0 kubenswrapper[4172]: I0307 21:14:07.849448 4172 scope.go:117] "RemoveContainer" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"
Mar 07 21:14:07.857615 master-0 kubenswrapper[4172]: I0307 21:14:07.857567 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-x9v76"]
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: E0307 21:14:07.857737 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-node"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: I0307 21:14:07.857753 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-node"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: E0307 21:14:07.857763 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="nbdb"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: I0307 21:14:07.857771 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="nbdb"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: E0307 21:14:07.857782 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-controller"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: I0307 21:14:07.857790 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-controller"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: E0307 21:14:07.857799 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: I0307 21:14:07.857807 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: E0307 21:14:07.857816 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="northd"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: I0307 21:14:07.857823 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="northd"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: E0307 21:14:07.857833 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="sbdb"
Mar 07 21:14:07.857825 master-0 kubenswrapper[4172]: I0307 21:14:07.857841 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="sbdb"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: E0307 21:14:07.857850 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovnkube-controller"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.857857 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovnkube-controller"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: E0307 21:14:07.857865 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-acl-logging"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.857873 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-acl-logging"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: E0307 21:14:07.857881 4172 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kubecfg-setup"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.857889 4172 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kubecfg-setup"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.857943 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="nbdb"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.857990 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-node"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.857999 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="kube-rbac-proxy-ovn-metrics"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.858011 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-controller"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.858019 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovn-acl-logging"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.858027 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="sbdb"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.858035 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="northd"
Mar 07 21:14:07.858757 master-0 kubenswrapper[4172]: I0307 21:14:07.858042 4172 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" containerName="ovnkube-controller"
Mar 07 21:14:07.860463 master-0 kubenswrapper[4172]: I0307 21:14:07.858987 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:07.868874 master-0 kubenswrapper[4172]: I0307 21:14:07.868832 4172 scope.go:117] "RemoveContainer" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"
Mar 07 21:14:07.890205 master-0 kubenswrapper[4172]: I0307 21:14:07.890141 4172 scope.go:117] "RemoveContainer" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"
Mar 07 21:14:07.911655 master-0 kubenswrapper[4172]: I0307 21:14:07.911598 4172 scope.go:117] "RemoveContainer" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"
Mar 07 21:14:07.912140 master-0 kubenswrapper[4172]: I0307 21:14:07.912082 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-ovn\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912207 master-0 kubenswrapper[4172]: I0307 21:14:07.912174 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-systemd-units\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912272 master-0 kubenswrapper[4172]: I0307 21:14:07.912253 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-env-overrides\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912324 master-0 kubenswrapper[4172]: I0307 21:14:07.912297 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-slash\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912358 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-ovn-kubernetes\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912399 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-netns\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912306 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912432 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-log-socket\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912471 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-systemd\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912471 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-slash" (OuterVolumeSpecName: "host-slash") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912527 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912536 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912578 master-0 kubenswrapper[4172]: I0307 21:14:07.912546 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-log-socket" (OuterVolumeSpecName: "log-socket") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912588 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-var-lib-openvswitch\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912697 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912720 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-netd\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912776 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-bin\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912813 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912855 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-var-lib-cni-networks-ovn-kubernetes\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912875 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912926 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-script-lib\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.912960 master-0 kubenswrapper[4172]: I0307 21:14:07.912943 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.912982 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovn-node-metrics-cert\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.913044 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-node-log\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.913091 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-etc-openvswitch\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.913124 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-node-log" (OuterVolumeSpecName: "node-log") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.913157 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grms2\" (UniqueName: \"kubernetes.io/projected/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-kube-api-access-grms2\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.913166 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.913215 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-kubelet\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.913253 master-0 kubenswrapper[4172]: I0307 21:14:07.913141 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:14:07.913516 master-0 kubenswrapper[4172]: I0307 21:14:07.913272 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-config\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.913516 master-0 kubenswrapper[4172]: I0307 21:14:07.913319 4172 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-openvswitch\") pod \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\" (UID: \"b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e\") "
Mar 07 21:14:07.913516 master-0 kubenswrapper[4172]: I0307 21:14:07.913392 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:14:07.913626 master-0 kubenswrapper[4172]: I0307 21:14:07.913551 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.913662 master-0 kubenswrapper[4172]: I0307 21:14:07.913633 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:14:07.913762 master-0 kubenswrapper[4172]: I0307 21:14:07.913676 4172 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-var-lib-openvswitch\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:07.913824 master-0 kubenswrapper[4172]: I0307 21:14:07.913780 4172 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-netd\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:07.913824 master-0 kubenswrapper[4172]: I0307 21:14:07.913809 4172 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-cni-bin\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:07.913946 master-0 kubenswrapper[4172]: I0307 21:14:07.913837 4172 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-var-lib-cni-networks-ovn-kubernetes\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:07.913946 master-0 kubenswrapper[4172]: I0307 21:14:07.913868 4172 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-script-lib\") on node \"master-0\" DevicePath \"\""
Mar 07
21:14:07.913946 master-0 kubenswrapper[4172]: I0307 21:14:07.913895 4172 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-node-log\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.913946 master-0 kubenswrapper[4172]: I0307 21:14:07.913921 4172 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-etc-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.913950 4172 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-openvswitch\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.913977 4172 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-ovn\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.914002 4172 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-env-overrides\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.914031 4172 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-slash\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.914059 4172 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-ovn-kubernetes\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 
master-0 kubenswrapper[4172]: I0307 21:14:07.914088 4172 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-run-netns\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.914116 4172 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-log-socket\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.913717 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:14:07.914164 master-0 kubenswrapper[4172]: I0307 21:14:07.914044 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:14:07.921305 master-0 kubenswrapper[4172]: I0307 21:14:07.921209 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:14:07.922579 master-0 kubenswrapper[4172]: I0307 21:14:07.922505 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-kube-api-access-grms2" (OuterVolumeSpecName: "kube-api-access-grms2") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "kube-api-access-grms2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:14:07.927007 master-0 kubenswrapper[4172]: I0307 21:14:07.926830 4172 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" (UID: "b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:14:07.930253 master-0 kubenswrapper[4172]: I0307 21:14:07.930115 4172 scope.go:117] "RemoveContainer" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb" Mar 07 21:14:07.943191 master-0 kubenswrapper[4172]: I0307 21:14:07.943009 4172 scope.go:117] "RemoveContainer" containerID="c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510" Mar 07 21:14:07.958282 master-0 kubenswrapper[4172]: I0307 21:14:07.958066 4172 scope.go:117] "RemoveContainer" containerID="9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9" Mar 07 21:14:07.971877 master-0 kubenswrapper[4172]: I0307 21:14:07.971844 4172 scope.go:117] "RemoveContainer" containerID="8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950" Mar 07 21:14:07.983109 master-0 kubenswrapper[4172]: I0307 21:14:07.982771 4172 scope.go:117] "RemoveContainer" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff" Mar 07 21:14:07.983753 master-0 kubenswrapper[4172]: E0307 
21:14:07.983632 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": container with ID starting with 5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff not found: ID does not exist" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff" Mar 07 21:14:07.983890 master-0 kubenswrapper[4172]: I0307 21:14:07.983762 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"} err="failed to get container status \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": rpc error: code = NotFound desc = could not find container \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": container with ID starting with 5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff not found: ID does not exist" Mar 07 21:14:07.983890 master-0 kubenswrapper[4172]: I0307 21:14:07.983820 4172 scope.go:117] "RemoveContainer" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7" Mar 07 21:14:07.984483 master-0 kubenswrapper[4172]: E0307 21:14:07.984433 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": container with ID starting with 0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7 not found: ID does not exist" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7" Mar 07 21:14:07.984575 master-0 kubenswrapper[4172]: I0307 21:14:07.984472 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"} err="failed to get container status 
\"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": rpc error: code = NotFound desc = could not find container \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": container with ID starting with 0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7 not found: ID does not exist" Mar 07 21:14:07.984575 master-0 kubenswrapper[4172]: I0307 21:14:07.984510 4172 scope.go:117] "RemoveContainer" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896" Mar 07 21:14:07.985356 master-0 kubenswrapper[4172]: E0307 21:14:07.985287 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": container with ID starting with a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896 not found: ID does not exist" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896" Mar 07 21:14:07.985356 master-0 kubenswrapper[4172]: I0307 21:14:07.985338 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"} err="failed to get container status \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": rpc error: code = NotFound desc = could not find container \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": container with ID starting with a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896 not found: ID does not exist" Mar 07 21:14:07.985505 master-0 kubenswrapper[4172]: I0307 21:14:07.985372 4172 scope.go:117] "RemoveContainer" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9" Mar 07 21:14:07.985909 master-0 kubenswrapper[4172]: E0307 21:14:07.985850 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": container with ID starting with eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9 not found: ID does not exist" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9" Mar 07 21:14:07.985985 master-0 kubenswrapper[4172]: I0307 21:14:07.985910 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"} err="failed to get container status \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": rpc error: code = NotFound desc = could not find container \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": container with ID starting with eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9 not found: ID does not exist" Mar 07 21:14:07.985985 master-0 kubenswrapper[4172]: I0307 21:14:07.985954 4172 scope.go:117] "RemoveContainer" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277" Mar 07 21:14:07.986413 master-0 kubenswrapper[4172]: E0307 21:14:07.986355 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": container with ID starting with f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277 not found: ID does not exist" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277" Mar 07 21:14:07.986413 master-0 kubenswrapper[4172]: I0307 21:14:07.986402 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"} err="failed to get container status \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": rpc error: code = NotFound desc = could not find container 
\"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": container with ID starting with f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277 not found: ID does not exist" Mar 07 21:14:07.986552 master-0 kubenswrapper[4172]: I0307 21:14:07.986432 4172 scope.go:117] "RemoveContainer" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb" Mar 07 21:14:07.987001 master-0 kubenswrapper[4172]: E0307 21:14:07.986927 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": container with ID starting with e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb not found: ID does not exist" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb" Mar 07 21:14:07.987132 master-0 kubenswrapper[4172]: I0307 21:14:07.987026 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"} err="failed to get container status \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": rpc error: code = NotFound desc = could not find container \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": container with ID starting with e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb not found: ID does not exist" Mar 07 21:14:07.987132 master-0 kubenswrapper[4172]: I0307 21:14:07.987046 4172 scope.go:117] "RemoveContainer" containerID="c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510" Mar 07 21:14:07.987776 master-0 kubenswrapper[4172]: E0307 21:14:07.987717 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": container with ID starting with 
c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510 not found: ID does not exist" containerID="c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510" Mar 07 21:14:07.987863 master-0 kubenswrapper[4172]: I0307 21:14:07.987770 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"} err="failed to get container status \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": rpc error: code = NotFound desc = could not find container \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": container with ID starting with c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510 not found: ID does not exist" Mar 07 21:14:07.987863 master-0 kubenswrapper[4172]: I0307 21:14:07.987801 4172 scope.go:117] "RemoveContainer" containerID="9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9" Mar 07 21:14:07.988342 master-0 kubenswrapper[4172]: E0307 21:14:07.988254 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": container with ID starting with 9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9 not found: ID does not exist" containerID="9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9" Mar 07 21:14:07.988342 master-0 kubenswrapper[4172]: I0307 21:14:07.988319 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"} err="failed to get container status \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": rpc error: code = NotFound desc = could not find container \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": container with ID starting with 
9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9 not found: ID does not exist" Mar 07 21:14:07.988487 master-0 kubenswrapper[4172]: I0307 21:14:07.988355 4172 scope.go:117] "RemoveContainer" containerID="8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950" Mar 07 21:14:07.989209 master-0 kubenswrapper[4172]: E0307 21:14:07.989130 4172 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": container with ID starting with 8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950 not found: ID does not exist" containerID="8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950" Mar 07 21:14:07.989312 master-0 kubenswrapper[4172]: I0307 21:14:07.989215 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"} err="failed to get container status \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": rpc error: code = NotFound desc = could not find container \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": container with ID starting with 8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950 not found: ID does not exist" Mar 07 21:14:07.989312 master-0 kubenswrapper[4172]: I0307 21:14:07.989272 4172 scope.go:117] "RemoveContainer" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff" Mar 07 21:14:07.990420 master-0 kubenswrapper[4172]: I0307 21:14:07.990358 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"} err="failed to get container status \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": rpc error: code = NotFound desc = could not find container 
\"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": container with ID starting with 5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff not found: ID does not exist" Mar 07 21:14:07.990420 master-0 kubenswrapper[4172]: I0307 21:14:07.990393 4172 scope.go:117] "RemoveContainer" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7" Mar 07 21:14:07.990867 master-0 kubenswrapper[4172]: I0307 21:14:07.990800 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"} err="failed to get container status \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": rpc error: code = NotFound desc = could not find container \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": container with ID starting with 0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7 not found: ID does not exist" Mar 07 21:14:07.990867 master-0 kubenswrapper[4172]: I0307 21:14:07.990853 4172 scope.go:117] "RemoveContainer" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896" Mar 07 21:14:07.991194 master-0 kubenswrapper[4172]: I0307 21:14:07.991141 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"} err="failed to get container status \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": rpc error: code = NotFound desc = could not find container \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": container with ID starting with a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896 not found: ID does not exist" Mar 07 21:14:07.991194 master-0 kubenswrapper[4172]: I0307 21:14:07.991178 4172 scope.go:117] "RemoveContainer" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9" Mar 07 
21:14:07.992002 master-0 kubenswrapper[4172]: I0307 21:14:07.991900 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"} err="failed to get container status \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": rpc error: code = NotFound desc = could not find container \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": container with ID starting with eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9 not found: ID does not exist" Mar 07 21:14:07.992002 master-0 kubenswrapper[4172]: I0307 21:14:07.991985 4172 scope.go:117] "RemoveContainer" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277" Mar 07 21:14:07.992513 master-0 kubenswrapper[4172]: I0307 21:14:07.992461 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"} err="failed to get container status \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": rpc error: code = NotFound desc = could not find container \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": container with ID starting with f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277 not found: ID does not exist" Mar 07 21:14:07.992513 master-0 kubenswrapper[4172]: I0307 21:14:07.992498 4172 scope.go:117] "RemoveContainer" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb" Mar 07 21:14:07.992995 master-0 kubenswrapper[4172]: I0307 21:14:07.992932 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"} err="failed to get container status \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": rpc error: code = NotFound desc = could not find container 
\"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": container with ID starting with e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb not found: ID does not exist" Mar 07 21:14:07.992995 master-0 kubenswrapper[4172]: I0307 21:14:07.992984 4172 scope.go:117] "RemoveContainer" containerID="c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510" Mar 07 21:14:07.993419 master-0 kubenswrapper[4172]: I0307 21:14:07.993357 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"} err="failed to get container status \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": rpc error: code = NotFound desc = could not find container \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": container with ID starting with c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510 not found: ID does not exist" Mar 07 21:14:07.993419 master-0 kubenswrapper[4172]: I0307 21:14:07.993401 4172 scope.go:117] "RemoveContainer" containerID="9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9" Mar 07 21:14:07.994829 master-0 kubenswrapper[4172]: I0307 21:14:07.994742 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"} err="failed to get container status \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": rpc error: code = NotFound desc = could not find container \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": container with ID starting with 9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9 not found: ID does not exist" Mar 07 21:14:07.994829 master-0 kubenswrapper[4172]: I0307 21:14:07.994816 4172 scope.go:117] "RemoveContainer" containerID="8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950" Mar 07 
21:14:07.995262 master-0 kubenswrapper[4172]: I0307 21:14:07.995215 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"} err="failed to get container status \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": rpc error: code = NotFound desc = could not find container \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": container with ID starting with 8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950 not found: ID does not exist" Mar 07 21:14:07.995262 master-0 kubenswrapper[4172]: I0307 21:14:07.995241 4172 scope.go:117] "RemoveContainer" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff" Mar 07 21:14:07.995650 master-0 kubenswrapper[4172]: I0307 21:14:07.995600 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"} err="failed to get container status \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": rpc error: code = NotFound desc = could not find container \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": container with ID starting with 5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff not found: ID does not exist" Mar 07 21:14:07.995650 master-0 kubenswrapper[4172]: I0307 21:14:07.995626 4172 scope.go:117] "RemoveContainer" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7" Mar 07 21:14:07.995946 master-0 kubenswrapper[4172]: I0307 21:14:07.995903 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"} err="failed to get container status \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": rpc error: code = NotFound desc = could not find container 
\"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": container with ID starting with 0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7 not found: ID does not exist" Mar 07 21:14:07.995946 master-0 kubenswrapper[4172]: I0307 21:14:07.995930 4172 scope.go:117] "RemoveContainer" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896" Mar 07 21:14:07.996277 master-0 kubenswrapper[4172]: I0307 21:14:07.996217 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"} err="failed to get container status \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": rpc error: code = NotFound desc = could not find container \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": container with ID starting with a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896 not found: ID does not exist" Mar 07 21:14:07.996277 master-0 kubenswrapper[4172]: I0307 21:14:07.996261 4172 scope.go:117] "RemoveContainer" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9" Mar 07 21:14:07.996736 master-0 kubenswrapper[4172]: I0307 21:14:07.996631 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"} err="failed to get container status \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": rpc error: code = NotFound desc = could not find container \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": container with ID starting with eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9 not found: ID does not exist" Mar 07 21:14:07.996736 master-0 kubenswrapper[4172]: I0307 21:14:07.996671 4172 scope.go:117] "RemoveContainer" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277" Mar 07 
21:14:07.997039 master-0 kubenswrapper[4172]: I0307 21:14:07.996957 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"} err="failed to get container status \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": rpc error: code = NotFound desc = could not find container \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": container with ID starting with f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277 not found: ID does not exist"
Mar 07 21:14:07.997039 master-0 kubenswrapper[4172]: I0307 21:14:07.997012 4172 scope.go:117] "RemoveContainer" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"
Mar 07 21:14:07.997341 master-0 kubenswrapper[4172]: I0307 21:14:07.997282 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"} err="failed to get container status \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": rpc error: code = NotFound desc = could not find container \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": container with ID starting with e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb not found: ID does not exist"
Mar 07 21:14:07.997341 master-0 kubenswrapper[4172]: I0307 21:14:07.997321 4172 scope.go:117] "RemoveContainer" containerID="c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"
Mar 07 21:14:07.997769 master-0 kubenswrapper[4172]: I0307 21:14:07.997651 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"} err="failed to get container status \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": rpc error: code = NotFound desc = could not find container \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": container with ID starting with c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510 not found: ID does not exist"
Mar 07 21:14:07.997769 master-0 kubenswrapper[4172]: I0307 21:14:07.997727 4172 scope.go:117] "RemoveContainer" containerID="9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"
Mar 07 21:14:07.998120 master-0 kubenswrapper[4172]: I0307 21:14:07.998061 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"} err="failed to get container status \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": rpc error: code = NotFound desc = could not find container \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": container with ID starting with 9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9 not found: ID does not exist"
Mar 07 21:14:07.998120 master-0 kubenswrapper[4172]: I0307 21:14:07.998101 4172 scope.go:117] "RemoveContainer" containerID="8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"
Mar 07 21:14:07.998489 master-0 kubenswrapper[4172]: I0307 21:14:07.998439 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"} err="failed to get container status \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": rpc error: code = NotFound desc = could not find container \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": container with ID starting with 8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950 not found: ID does not exist"
Mar 07 21:14:07.998489 master-0 kubenswrapper[4172]: I0307 21:14:07.998471 4172 scope.go:117] "RemoveContainer" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"
Mar 07 21:14:07.998803 master-0 kubenswrapper[4172]: I0307 21:14:07.998758 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"} err="failed to get container status \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": rpc error: code = NotFound desc = could not find container \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": container with ID starting with 5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff not found: ID does not exist"
Mar 07 21:14:07.998803 master-0 kubenswrapper[4172]: I0307 21:14:07.998793 4172 scope.go:117] "RemoveContainer" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"
Mar 07 21:14:07.999285 master-0 kubenswrapper[4172]: I0307 21:14:07.999237 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"} err="failed to get container status \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": rpc error: code = NotFound desc = could not find container \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": container with ID starting with 0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7 not found: ID does not exist"
Mar 07 21:14:07.999285 master-0 kubenswrapper[4172]: I0307 21:14:07.999263 4172 scope.go:117] "RemoveContainer" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"
Mar 07 21:14:07.999629 master-0 kubenswrapper[4172]: I0307 21:14:07.999557 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"} err="failed to get container status \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": rpc error: code = NotFound desc = could not find container \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": container with ID starting with a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896 not found: ID does not exist"
Mar 07 21:14:07.999629 master-0 kubenswrapper[4172]: I0307 21:14:07.999611 4172 scope.go:117] "RemoveContainer" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"
Mar 07 21:14:08.000179 master-0 kubenswrapper[4172]: I0307 21:14:08.000121 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"} err="failed to get container status \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": rpc error: code = NotFound desc = could not find container \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": container with ID starting with eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9 not found: ID does not exist"
Mar 07 21:14:08.000179 master-0 kubenswrapper[4172]: I0307 21:14:08.000160 4172 scope.go:117] "RemoveContainer" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"
Mar 07 21:14:08.000718 master-0 kubenswrapper[4172]: I0307 21:14:08.000616 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"} err="failed to get container status \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": rpc error: code = NotFound desc = could not find container \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": container with ID starting with f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277 not found: ID does not exist"
Mar 07 21:14:08.000718 master-0 kubenswrapper[4172]: I0307 21:14:08.000662 4172 scope.go:117] "RemoveContainer" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"
Mar 07 21:14:08.001123 master-0 kubenswrapper[4172]: I0307 21:14:08.001074 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"} err="failed to get container status \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": rpc error: code = NotFound desc = could not find container \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": container with ID starting with e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb not found: ID does not exist"
Mar 07 21:14:08.001123 master-0 kubenswrapper[4172]: I0307 21:14:08.001103 4172 scope.go:117] "RemoveContainer" containerID="c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"
Mar 07 21:14:08.001529 master-0 kubenswrapper[4172]: I0307 21:14:08.001466 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510"} err="failed to get container status \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": rpc error: code = NotFound desc = could not find container \"c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510\": container with ID starting with c4afe1cfb9677c14eb427f9db0dc5ec670feae573064ce8f8cbc2195da42e510 not found: ID does not exist"
Mar 07 21:14:08.001529 master-0 kubenswrapper[4172]: I0307 21:14:08.001506 4172 scope.go:117] "RemoveContainer" containerID="9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"
Mar 07 21:14:08.001917 master-0 kubenswrapper[4172]: I0307 21:14:08.001842 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9"} err="failed to get container status \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": rpc error: code = NotFound desc = could not find container \"9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9\": container with ID starting with 9cce6915772592bfdbe2c97cbb28ce58a703a47bd7521752cbac35b8b3ab0aa9 not found: ID does not exist"
Mar 07 21:14:08.001917 master-0 kubenswrapper[4172]: I0307 21:14:08.001900 4172 scope.go:117] "RemoveContainer" containerID="8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"
Mar 07 21:14:08.002334 master-0 kubenswrapper[4172]: I0307 21:14:08.002279 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950"} err="failed to get container status \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": rpc error: code = NotFound desc = could not find container \"8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950\": container with ID starting with 8aa37267eb159409459b4037fd5042a510ae38ddf3b7b46804c52f1d30342950 not found: ID does not exist"
Mar 07 21:14:08.002334 master-0 kubenswrapper[4172]: I0307 21:14:08.002300 4172 scope.go:117] "RemoveContainer" containerID="5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"
Mar 07 21:14:08.002777 master-0 kubenswrapper[4172]: I0307 21:14:08.002667 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff"} err="failed to get container status \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": rpc error: code = NotFound desc = could not find container \"5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff\": container with ID starting with 5978d6341931b7245059beb880279561fbb356d91c86f6185d80be1009ba25ff not found: ID does not exist"
Mar 07 21:14:08.002777 master-0 kubenswrapper[4172]: I0307 21:14:08.002773 4172 scope.go:117] "RemoveContainer" containerID="0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"
Mar 07 21:14:08.003238 master-0 kubenswrapper[4172]: I0307 21:14:08.003191 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7"} err="failed to get container status \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": rpc error: code = NotFound desc = could not find container \"0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7\": container with ID starting with 0a7cf22d4203d43d95d04f363d976e79b6578a3c04902536372a18da12f4dbc7 not found: ID does not exist"
Mar 07 21:14:08.003238 master-0 kubenswrapper[4172]: I0307 21:14:08.003215 4172 scope.go:117] "RemoveContainer" containerID="a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"
Mar 07 21:14:08.003620 master-0 kubenswrapper[4172]: I0307 21:14:08.003546 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896"} err="failed to get container status \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": rpc error: code = NotFound desc = could not find container \"a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896\": container with ID starting with a8740b3eb3c53f217aae29758a548417478f7cb045139e6deabca9f8011f1896 not found: ID does not exist"
Mar 07 21:14:08.003620 master-0 kubenswrapper[4172]: I0307 21:14:08.003607 4172 scope.go:117] "RemoveContainer" containerID="eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"
Mar 07 21:14:08.004153 master-0 kubenswrapper[4172]: I0307 21:14:08.004106 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9"} err="failed to get container status \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": rpc error: code = NotFound desc = could not find container \"eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9\": container with ID starting with eb549c7995dac7295a61c1d30be9df27322df96391baf42f44c49fb7f42afaf9 not found: ID does not exist"
Mar 07 21:14:08.004153 master-0 kubenswrapper[4172]: I0307 21:14:08.004129 4172 scope.go:117] "RemoveContainer" containerID="f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"
Mar 07 21:14:08.004469 master-0 kubenswrapper[4172]: I0307 21:14:08.004429 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277"} err="failed to get container status \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": rpc error: code = NotFound desc = could not find container \"f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277\": container with ID starting with f09be9b7bf90f5b92799cf8a0a280ae69de83911a346da6f698e86265126a277 not found: ID does not exist"
Mar 07 21:14:08.004469 master-0 kubenswrapper[4172]: I0307 21:14:08.004451 4172 scope.go:117] "RemoveContainer" containerID="e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"
Mar 07 21:14:08.004855 master-0 kubenswrapper[4172]: I0307 21:14:08.004814 4172 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb"} err="failed to get container status \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": rpc error: code = NotFound desc = could not find container \"e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb\": container with ID starting with e282d4717f70735b6ae08fe79afbe6aa5f0865d1ea02ad95c2f7ae56af7e2feb not found: ID does not exist"
Mar 07 21:14:08.015436 master-0 kubenswrapper[4172]: I0307 21:14:08.015315 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015436 master-0 kubenswrapper[4172]: I0307 21:14:08.015383 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tpztb\" (UniqueName: \"kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015607 master-0 kubenswrapper[4172]: I0307 21:14:08.015456 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015607 master-0 kubenswrapper[4172]: I0307 21:14:08.015490 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015607 master-0 kubenswrapper[4172]: I0307 21:14:08.015523 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015607 master-0 kubenswrapper[4172]: I0307 21:14:08.015557 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015926 master-0 kubenswrapper[4172]: I0307 21:14:08.015672 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015926 master-0 kubenswrapper[4172]: I0307 21:14:08.015785 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015926 master-0 kubenswrapper[4172]: I0307 21:14:08.015823 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.015926 master-0 kubenswrapper[4172]: I0307 21:14:08.015860 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016175 master-0 kubenswrapper[4172]: I0307 21:14:08.015936 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016175 master-0 kubenswrapper[4172]: I0307 21:14:08.016003 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016175 master-0 kubenswrapper[4172]: I0307 21:14:08.016103 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016345 master-0 kubenswrapper[4172]: I0307 21:14:08.016180 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016345 master-0 kubenswrapper[4172]: I0307 21:14:08.016266 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016455 master-0 kubenswrapper[4172]: I0307 21:14:08.016371 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016527 master-0 kubenswrapper[4172]: I0307 21:14:08.016483 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016588 master-0 kubenswrapper[4172]: I0307 21:14:08.016561 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016667 master-0 kubenswrapper[4172]: I0307 21:14:08.016605 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016778 master-0 kubenswrapper[4172]: I0307 21:14:08.016704 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.016852 master-0 kubenswrapper[4172]: I0307 21:14:08.016789 4172 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-run-systemd\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:08.016852 master-0 kubenswrapper[4172]: I0307 21:14:08.016819 4172 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovn-node-metrics-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:08.016852 master-0 kubenswrapper[4172]: I0307 21:14:08.016842 4172 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grms2\" (UniqueName: \"kubernetes.io/projected/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-kube-api-access-grms2\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:08.017028 master-0 kubenswrapper[4172]: I0307 21:14:08.016861 4172 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-ovnkube-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:08.017028 master-0 kubenswrapper[4172]: I0307 21:14:08.016884 4172 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-host-kubelet\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:08.017028 master-0 kubenswrapper[4172]: I0307 21:14:08.016902 4172 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e-systemd-units\") on node \"master-0\" DevicePath \"\""
Mar 07 21:14:08.118886 master-0 kubenswrapper[4172]: I0307 21:14:08.118766 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119095 master-0 kubenswrapper[4172]: I0307 21:14:08.118922 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119095 master-0 kubenswrapper[4172]: I0307 21:14:08.118937 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119095 master-0 kubenswrapper[4172]: I0307 21:14:08.118993 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119272 master-0 kubenswrapper[4172]: I0307 21:14:08.119107 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119272 master-0 kubenswrapper[4172]: I0307 21:14:08.119182 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119272 master-0 kubenswrapper[4172]: I0307 21:14:08.119224 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119434 master-0 kubenswrapper[4172]: I0307 21:14:08.119361 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119662 master-0 kubenswrapper[4172]: I0307 21:14:08.119573 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119827 master-0 kubenswrapper[4172]: I0307 21:14:08.119770 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119908 master-0 kubenswrapper[4172]: I0307 21:14:08.119858 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.119975 master-0 kubenswrapper[4172]: I0307 21:14:08.119916 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120067 master-0 kubenswrapper[4172]: I0307 21:14:08.120027 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120127 master-0 kubenswrapper[4172]: I0307 21:14:08.120081 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120184 master-0 kubenswrapper[4172]: I0307 21:14:08.120094 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120184 master-0 kubenswrapper[4172]: I0307 21:14:08.120141 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120301 master-0 kubenswrapper[4172]: I0307 21:14:08.120162 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120301 master-0 kubenswrapper[4172]: I0307 21:14:08.120193 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120301 master-0 kubenswrapper[4172]: I0307 21:14:08.120042 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120597 master-0 kubenswrapper[4172]: I0307 21:14:08.120279 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120852 master-0 kubenswrapper[4172]: I0307 21:14:08.120645 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120852 master-0 kubenswrapper[4172]: I0307 21:14:08.120727 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.120852 master-0 kubenswrapper[4172]: I0307 21:14:08.120797 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121011 master-0 kubenswrapper[4172]: I0307 21:14:08.120859 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121011 master-0 kubenswrapper[4172]: I0307 21:14:08.120934 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121011 master-0 kubenswrapper[4172]: I0307 21:14:08.120992 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121180 master-0 kubenswrapper[4172]: I0307 21:14:08.121026 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121180 master-0 kubenswrapper[4172]: I0307 21:14:08.120992 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121180 master-0 kubenswrapper[4172]: I0307 21:14:08.121078 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121180 master-0 kubenswrapper[4172]: I0307 21:14:08.121120 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121180 master-0 kubenswrapper[4172]: I0307 21:14:08.121133 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121180 master-0 kubenswrapper[4172]: I0307 21:14:08.121161 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121489 master-0 kubenswrapper[4172]: I0307 21:14:08.121201 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121489 master-0 kubenswrapper[4172]: I0307 21:14:08.121222 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121489 master-0 kubenswrapper[4172]: I0307 21:14:08.121257 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpztb\" (UniqueName: \"kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:08.121489 master-0 kubenswrapper[4172]: I0307 21:14:08.121296 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:08.122760 master-0 kubenswrapper[4172]: I0307 21:14:08.122474 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:08.122760 master-0 kubenswrapper[4172]: I0307 21:14:08.122592 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:08.127139 master-0 kubenswrapper[4172]: I0307 21:14:08.127062 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:08.138512 master-0 kubenswrapper[4172]: I0307 21:14:08.138443 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpztb\" (UniqueName: \"kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:08.173067 master-0 kubenswrapper[4172]: I0307 21:14:08.172993 4172 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-ovn-kubernetes/ovnkube-node-rqhcv"] Mar 07 21:14:08.181104 master-0 kubenswrapper[4172]: I0307 21:14:08.181017 4172 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-rqhcv"] Mar 07 21:14:08.182748 master-0 kubenswrapper[4172]: I0307 21:14:08.182090 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:08.200167 master-0 kubenswrapper[4172]: W0307 21:14:08.200089 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod420c6d8f_6313_4d6c_b817_420797fc6878.slice/crio-653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319 WatchSource:0}: Error finding container 653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319: Status 404 returned error can't find the container with id 653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319 Mar 07 21:14:08.280894 master-0 kubenswrapper[4172]: I0307 21:14:08.280723 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:08.280894 master-0 kubenswrapper[4172]: I0307 21:14:08.280850 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:08.281092 master-0 kubenswrapper[4172]: E0307 21:14:08.280954 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:08.281092 master-0 kubenswrapper[4172]: E0307 21:14:08.281059 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:08.289372 master-0 kubenswrapper[4172]: I0307 21:14:08.289294 4172 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e" path="/var/lib/kubelet/pods/b8ea3e79-b0a0-4c22-a60f-c1d1d972fc0e/volumes" Mar 07 21:14:08.819222 master-0 kubenswrapper[4172]: I0307 21:14:08.819087 4172 generic.go:334] "Generic (PLEG): container finished" podID="420c6d8f-6313-4d6c-b817-420797fc6878" containerID="89e83b02510db448aa7211c7a69aa7fdf926031ee29094a8ecb9aeeb18ccc925" exitCode=0 Mar 07 21:14:08.819222 master-0 kubenswrapper[4172]: I0307 21:14:08.819184 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerDied","Data":"89e83b02510db448aa7211c7a69aa7fdf926031ee29094a8ecb9aeeb18ccc925"} Mar 07 21:14:08.820320 master-0 kubenswrapper[4172]: I0307 21:14:08.819540 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319"} Mar 07 21:14:09.831621 master-0 kubenswrapper[4172]: I0307 21:14:09.831150 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" 
event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"a8398a9e14c05745ac063f18640a690e994cd0b728b6dd762b9567e513d16e8b"} Mar 07 21:14:09.831621 master-0 kubenswrapper[4172]: I0307 21:14:09.831607 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"c42557909445f173a160d9b5c30b3fd2a5f5644903a696fb30e5913cb6af9ab1"} Mar 07 21:14:09.831621 master-0 kubenswrapper[4172]: I0307 21:14:09.831636 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"92a3827265ebc1406082ad2a5425f3421662ffdb4295f0fff0bfcee346134b5b"} Mar 07 21:14:09.833025 master-0 kubenswrapper[4172]: I0307 21:14:09.831655 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"8013d6b8f5ac7974c4e6b6bcd5cfedc79a316cacb11f184cbbf99cdfc45d4a14"} Mar 07 21:14:09.833025 master-0 kubenswrapper[4172]: I0307 21:14:09.831674 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"bf26c349cf30c564e968b874b57980a11f507ef607dc7c5a1fd71fe68e3854bc"} Mar 07 21:14:09.833025 master-0 kubenswrapper[4172]: I0307 21:14:09.831728 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"444baff00833fd48e28aa5ce389e12293793724945a2ae8dad067a2189532602"} Mar 07 21:14:10.281176 master-0 kubenswrapper[4172]: I0307 21:14:10.281074 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:10.281525 master-0 kubenswrapper[4172]: I0307 21:14:10.281114 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:10.282347 master-0 kubenswrapper[4172]: E0307 21:14:10.282258 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:10.282568 master-0 kubenswrapper[4172]: E0307 21:14:10.282491 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:12.060352 master-0 kubenswrapper[4172]: I0307 21:14:12.060221 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:12.061640 master-0 kubenswrapper[4172]: E0307 21:14:12.060497 4172 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:12.061640 master-0 kubenswrapper[4172]: E0307 21:14:12.060581 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:15:16.06055439 +0000 UTC m=+166.732972317 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:12.280266 master-0 kubenswrapper[4172]: I0307 21:14:12.280050 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:12.280902 master-0 kubenswrapper[4172]: I0307 21:14:12.280186 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:12.281065 master-0 kubenswrapper[4172]: E0307 21:14:12.280840 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:12.281480 master-0 kubenswrapper[4172]: E0307 21:14:12.281415 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:12.858006 master-0 kubenswrapper[4172]: I0307 21:14:12.857872 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"6dbc568c9dbf3d115ebfe55af9871029f2effc0847fcb9f729038a26c05ba2d9"} Mar 07 21:14:14.280428 master-0 kubenswrapper[4172]: I0307 21:14:14.280289 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:14.280428 master-0 kubenswrapper[4172]: I0307 21:14:14.280428 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:14.281593 master-0 kubenswrapper[4172]: E0307 21:14:14.280564 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:14.281593 master-0 kubenswrapper[4172]: E0307 21:14:14.280863 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:14.570673 master-0 kubenswrapper[4172]: I0307 21:14:14.570411 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 07 21:14:14.600537 master-0 kubenswrapper[4172]: I0307 21:14:14.587733 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:14.600537 master-0 kubenswrapper[4172]: E0307 21:14:14.588008 4172 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 07 21:14:14.600537 master-0 kubenswrapper[4172]: E0307 21:14:14.588044 4172 
projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 07 21:14:14.600537 master-0 kubenswrapper[4172]: E0307 21:14:14.588064 4172 projected.go:194] Error preparing data for projected volume kube-api-access-qwzgb for pod openshift-network-diagnostics/network-check-target-fr4qr: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 21:14:14.600537 master-0 kubenswrapper[4172]: E0307 21:14:14.588159 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb podName:15270349-f3aa-43bc-88a8-f0fff3aa2528 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:46.588131638 +0000 UTC m=+137.260549565 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-qwzgb" (UniqueName: "kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb") pod "network-check-target-fr4qr" (UID: "15270349-f3aa-43bc-88a8-f0fff3aa2528") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 07 21:14:14.875224 master-0 kubenswrapper[4172]: I0307 21:14:14.875013 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" event={"ID":"420c6d8f-6313-4d6c-b817-420797fc6878","Type":"ContainerStarted","Data":"466cb520ce392ff289bdc1c36ead735e8258d92add4800b4b92beca2519b338f"} Mar 07 21:14:14.875803 master-0 kubenswrapper[4172]: I0307 21:14:14.875602 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:14.875803 master-0 kubenswrapper[4172]: I0307 21:14:14.875675 4172 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:14.918537 master-0 kubenswrapper[4172]: I0307 21:14:14.917789 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:14.918537 master-0 kubenswrapper[4172]: I0307 21:14:14.918493 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:14.972844 master-0 kubenswrapper[4172]: I0307 21:14:14.971202 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" podStartSLOduration=7.97116063 podStartE2EDuration="7.97116063s" podCreationTimestamp="2026-03-07 21:14:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:14.969721152 +0000 UTC m=+105.642139089" watchObservedRunningTime="2026-03-07 21:14:14.97116063 +0000 UTC m=+105.643578567" Mar 07 21:14:15.110869 master-0 kubenswrapper[4172]: I0307 21:14:15.110778 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podStartSLOduration=1.110759048 podStartE2EDuration="1.110759048s" podCreationTimestamp="2026-03-07 21:14:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:15.029243344 +0000 UTC m=+105.701661281" watchObservedRunningTime="2026-03-07 21:14:15.110759048 +0000 UTC m=+105.783176945" Mar 07 21:14:15.480520 master-0 kubenswrapper[4172]: I0307 21:14:15.480454 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 07 21:14:15.878957 master-0 kubenswrapper[4172]: I0307 21:14:15.878891 4172 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:16.280726 master-0 kubenswrapper[4172]: I0307 21:14:16.280569 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:16.280993 master-0 kubenswrapper[4172]: E0307 21:14:16.280875 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:16.281423 master-0 kubenswrapper[4172]: I0307 21:14:16.281381 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:16.281627 master-0 kubenswrapper[4172]: E0307 21:14:16.281585 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:18.280608 master-0 kubenswrapper[4172]: I0307 21:14:18.280507 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:18.280608 master-0 kubenswrapper[4172]: I0307 21:14:18.280565 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:18.282185 master-0 kubenswrapper[4172]: E0307 21:14:18.280713 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:18.282185 master-0 kubenswrapper[4172]: E0307 21:14:18.280852 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:20.279709 master-0 kubenswrapper[4172]: I0307 21:14:20.279571 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:20.280730 master-0 kubenswrapper[4172]: I0307 21:14:20.279783 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:20.280857 master-0 kubenswrapper[4172]: E0307 21:14:20.280794 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-l2bdp" podUID="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" Mar 07 21:14:20.281125 master-0 kubenswrapper[4172]: E0307 21:14:20.280982 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-fr4qr" podUID="15270349-f3aa-43bc-88a8-f0fff3aa2528" Mar 07 21:14:20.308361 master-0 kubenswrapper[4172]: I0307 21:14:20.308186 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/bootstrap-kube-controller-manager-master-0" podStartSLOduration=5.308153341 podStartE2EDuration="5.308153341s" podCreationTimestamp="2026-03-07 21:14:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:20.304925067 +0000 UTC m=+110.977343034" watchObservedRunningTime="2026-03-07 21:14:20.308153341 +0000 UTC m=+110.980571268" Mar 07 21:14:21.155105 master-0 kubenswrapper[4172]: I0307 21:14:21.154542 4172 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeReady" Mar 07 21:14:21.155407 master-0 kubenswrapper[4172]: I0307 21:14:21.155288 4172 kubelet_node_status.go:538] "Fast updating node status as it just became ready" Mar 07 21:14:21.202656 master-0 kubenswrapper[4172]: I0307 21:14:21.202446 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"] Mar 07 21:14:21.203717 master-0 kubenswrapper[4172]: I0307 21:14:21.203640 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:21.205273 master-0 kubenswrapper[4172]: I0307 21:14:21.205238 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"] Mar 07 21:14:21.206036 master-0 kubenswrapper[4172]: I0307 21:14:21.205999 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.212753 master-0 kubenswrapper[4172]: I0307 21:14:21.208565 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"] Mar 07 21:14:21.212753 master-0 kubenswrapper[4172]: I0307 21:14:21.211279 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:21.215656 master-0 kubenswrapper[4172]: I0307 21:14:21.215588 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"] Mar 07 21:14:21.216416 master-0 kubenswrapper[4172]: I0307 21:14:21.216363 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:21.217861 master-0 kubenswrapper[4172]: I0307 21:14:21.217791 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"] Mar 07 21:14:21.218070 master-0 kubenswrapper[4172]: I0307 21:14:21.218018 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 07 21:14:21.218469 master-0 kubenswrapper[4172]: I0307 21:14:21.218419 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.225008 master-0 kubenswrapper[4172]: I0307 21:14:21.224935 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"] Mar 07 21:14:21.225511 master-0 kubenswrapper[4172]: I0307 21:14:21.225445 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 21:14:21.226209 master-0 kubenswrapper[4172]: I0307 21:14:21.226166 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:21.231451 master-0 kubenswrapper[4172]: I0307 21:14:21.231368 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 21:14:21.231921 master-0 kubenswrapper[4172]: I0307 21:14:21.231826 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"] Mar 07 21:14:21.236728 master-0 kubenswrapper[4172]: I0307 21:14:21.234174 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:21.236728 master-0 kubenswrapper[4172]: I0307 21:14:21.234472 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 07 21:14:21.236728 master-0 kubenswrapper[4172]: I0307 21:14:21.235138 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 07 21:14:21.236728 master-0 kubenswrapper[4172]: I0307 21:14:21.235805 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-tklw9"]
Mar 07 21:14:21.238486 master-0 kubenswrapper[4172]: I0307 21:14:21.238422 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 07 21:14:21.240091 master-0 kubenswrapper[4172]: I0307 21:14:21.240039 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 07 21:14:21.240371 master-0 kubenswrapper[4172]: I0307 21:14:21.240328 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 07 21:14:21.243098 master-0 kubenswrapper[4172]: I0307 21:14:21.240121 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.243098 master-0 kubenswrapper[4172]: I0307 21:14:21.240110 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 07 21:14:21.243098 master-0 kubenswrapper[4172]: I0307 21:14:21.241155 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.253433 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.254243 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.254570 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.256946 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"]
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.257586 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"]
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.257955 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.258202 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 07 21:14:21.258781 master-0 kubenswrapper[4172]: I0307 21:14:21.258601 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:14:21.260594 master-0 kubenswrapper[4172]: I0307 21:14:21.260560 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 07 21:14:21.260944 master-0 kubenswrapper[4172]: I0307 21:14:21.260886 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.261004 master-0 kubenswrapper[4172]: I0307 21:14:21.260908 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 07 21:14:21.263263 master-0 kubenswrapper[4172]: I0307 21:14:21.263212 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 07 21:14:21.263375 master-0 kubenswrapper[4172]: I0307 21:14:21.263354 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 07 21:14:21.263566 master-0 kubenswrapper[4172]: I0307 21:14:21.263435 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.263778 master-0 kubenswrapper[4172]: I0307 21:14:21.263747 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"]
Mar 07 21:14:21.263778 master-0 kubenswrapper[4172]: I0307 21:14:21.263761 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 07 21:14:21.263895 master-0 kubenswrapper[4172]: I0307 21:14:21.263778 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 07 21:14:21.264401 master-0 kubenswrapper[4172]: I0307 21:14:21.264331 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 07 21:14:21.265845 master-0 kubenswrapper[4172]: I0307 21:14:21.264908 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.265845 master-0 kubenswrapper[4172]: I0307 21:14:21.264928 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 07 21:14:21.265845 master-0 kubenswrapper[4172]: I0307 21:14:21.265271 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.269280 master-0 kubenswrapper[4172]: I0307 21:14:21.269247 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"]
Mar 07 21:14:21.269448 master-0 kubenswrapper[4172]: I0307 21:14:21.269406 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:21.269673 master-0 kubenswrapper[4172]: I0307 21:14:21.269621 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.269673 master-0 kubenswrapper[4172]: I0307 21:14:21.269637 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 07 21:14:21.269889 master-0 kubenswrapper[4172]: I0307 21:14:21.269857 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.270321 master-0 kubenswrapper[4172]: I0307 21:14:21.270300 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"]
Mar 07 21:14:21.272219 master-0 kubenswrapper[4172]: I0307 21:14:21.270598 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:21.272476 master-0 kubenswrapper[4172]: I0307 21:14:21.272456 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"]
Mar 07 21:14:21.273061 master-0 kubenswrapper[4172]: I0307 21:14:21.273029 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.273177 master-0 kubenswrapper[4172]: I0307 21:14:21.273130 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:14:21.273582 master-0 kubenswrapper[4172]: I0307 21:14:21.273532 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 07 21:14:21.274354 master-0 kubenswrapper[4172]: I0307 21:14:21.274311 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-cb227"]
Mar 07 21:14:21.276055 master-0 kubenswrapper[4172]: I0307 21:14:21.275677 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"]
Mar 07 21:14:21.281519 master-0 kubenswrapper[4172]: I0307 21:14:21.277576 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 07 21:14:21.281519 master-0 kubenswrapper[4172]: I0307 21:14:21.277880 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"]
Mar 07 21:14:21.281519 master-0 kubenswrapper[4172]: I0307 21:14:21.278580 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"]
Mar 07 21:14:21.281519 master-0 kubenswrapper[4172]: I0307 21:14:21.278960 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:21.281519 master-0 kubenswrapper[4172]: I0307 21:14:21.279511 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:21.281519 master-0 kubenswrapper[4172]: I0307 21:14:21.281480 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-mmqbs"]
Mar 07 21:14:21.299200 master-0 kubenswrapper[4172]: I0307 21:14:21.299139 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 07 21:14:21.299443 master-0 kubenswrapper[4172]: I0307 21:14:21.299273 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:14:21.299537 master-0 kubenswrapper[4172]: I0307 21:14:21.299513 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:21.304493 master-0 kubenswrapper[4172]: I0307 21:14:21.300258 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-wqqqr"]
Mar 07 21:14:21.304493 master-0 kubenswrapper[4172]: I0307 21:14:21.301221 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5"]
Mar 07 21:14:21.304493 master-0 kubenswrapper[4172]: I0307 21:14:21.301943 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5"
Mar 07 21:14:21.304493 master-0 kubenswrapper[4172]: I0307 21:14:21.303886 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:14:21.304493 master-0 kubenswrapper[4172]: I0307 21:14:21.304436 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:14:21.305165 master-0 kubenswrapper[4172]: I0307 21:14:21.304944 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 07 21:14:21.305733 master-0 kubenswrapper[4172]: I0307 21:14:21.305667 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 07 21:14:21.305899 master-0 kubenswrapper[4172]: I0307 21:14:21.305874 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 07 21:14:21.306368 master-0 kubenswrapper[4172]: I0307 21:14:21.306332 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 07 21:14:21.306606 master-0 kubenswrapper[4172]: I0307 21:14:21.306559 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 07 21:14:21.306706 master-0 kubenswrapper[4172]: I0307 21:14:21.306660 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"]
Mar 07 21:14:21.307024 master-0 kubenswrapper[4172]: I0307 21:14:21.306991 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 07 21:14:21.307261 master-0 kubenswrapper[4172]: I0307 21:14:21.307223 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 07 21:14:21.307715 master-0 kubenswrapper[4172]: I0307 21:14:21.307663 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.307993 master-0 kubenswrapper[4172]: I0307 21:14:21.307952 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.309136 master-0 kubenswrapper[4172]: I0307 21:14:21.308499 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.309136 master-0 kubenswrapper[4172]: I0307 21:14:21.308586 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 07 21:14:21.309136 master-0 kubenswrapper[4172]: I0307 21:14:21.309086 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 07 21:14:21.309362 master-0 kubenswrapper[4172]: I0307 21:14:21.309344 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.309425 master-0 kubenswrapper[4172]: I0307 21:14:21.309370 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.309552 master-0 kubenswrapper[4172]: I0307 21:14:21.309531 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.309552 master-0 kubenswrapper[4172]: I0307 21:14:21.309548 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.309703 master-0 kubenswrapper[4172]: I0307 21:14:21.309666 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle"
Mar 07 21:14:21.309750 master-0 kubenswrapper[4172]: I0307 21:14:21.309730 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.309830 master-0 kubenswrapper[4172]: I0307 21:14:21.309792 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 07 21:14:21.309866 master-0 kubenswrapper[4172]: I0307 21:14:21.309853 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 07 21:14:21.309934 master-0 kubenswrapper[4172]: I0307 21:14:21.309896 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.309983 master-0 kubenswrapper[4172]: I0307 21:14:21.309963 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 07 21:14:21.310018 master-0 kubenswrapper[4172]: I0307 21:14:21.309994 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.310130 master-0 kubenswrapper[4172]: I0307 21:14:21.310097 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 07 21:14:21.310748 master-0 kubenswrapper[4172]: I0307 21:14:21.310460 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:21.310748 master-0 kubenswrapper[4172]: I0307 21:14:21.310641 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 07 21:14:21.311214 master-0 kubenswrapper[4172]: I0307 21:14:21.311186 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.311340 master-0 kubenswrapper[4172]: I0307 21:14:21.311308 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.311521 master-0 kubenswrapper[4172]: I0307 21:14:21.311486 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.311780 master-0 kubenswrapper[4172]: I0307 21:14:21.311752 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 07 21:14:21.311945 master-0 kubenswrapper[4172]: I0307 21:14:21.311888 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 07 21:14:21.312034 master-0 kubenswrapper[4172]: I0307 21:14:21.312011 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 07 21:14:21.312196 master-0 kubenswrapper[4172]: I0307 21:14:21.312139 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 07 21:14:21.312395 master-0 kubenswrapper[4172]: I0307 21:14:21.312363 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.312443 master-0 kubenswrapper[4172]: I0307 21:14:21.312414 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.312521 master-0 kubenswrapper[4172]: I0307 21:14:21.311928 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 07 21:14:21.312634 master-0 kubenswrapper[4172]: I0307 21:14:21.312509 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 07 21:14:21.312771 master-0 kubenswrapper[4172]: I0307 21:14:21.312742 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.313198 master-0 kubenswrapper[4172]: I0307 21:14:21.313145 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.313615 master-0 kubenswrapper[4172]: I0307 21:14:21.313582 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 07 21:14:21.324023 master-0 kubenswrapper[4172]: I0307 21:14:21.322986 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"]
Mar 07 21:14:21.324023 master-0 kubenswrapper[4172]: I0307 21:14:21.323666 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.326316 master-0 kubenswrapper[4172]: I0307 21:14:21.326278 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 07 21:14:21.327512 master-0 kubenswrapper[4172]: I0307 21:14:21.327477 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"]
Mar 07 21:14:21.328371 master-0 kubenswrapper[4172]: I0307 21:14:21.328343 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"]
Mar 07 21:14:21.329138 master-0 kubenswrapper[4172]: I0307 21:14:21.329103 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"]
Mar 07 21:14:21.330783 master-0 kubenswrapper[4172]: I0307 21:14:21.330735 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 07 21:14:21.330887 master-0 kubenswrapper[4172]: I0307 21:14:21.330848 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"]
Mar 07 21:14:21.331119 master-0 kubenswrapper[4172]: I0307 21:14:21.331091 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 07 21:14:21.331152 master-0 kubenswrapper[4172]: I0307 21:14:21.331116 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"]
Mar 07 21:14:21.331269 master-0 kubenswrapper[4172]: I0307 21:14:21.331240 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 07 21:14:21.331425 master-0 kubenswrapper[4172]: I0307 21:14:21.331388 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 07 21:14:21.331528 master-0 kubenswrapper[4172]: I0307 21:14:21.331496 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 07 21:14:21.331625 master-0 kubenswrapper[4172]: I0307 21:14:21.331605 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 07 21:14:21.331810 master-0 kubenswrapper[4172]: I0307 21:14:21.331792 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 07 21:14:21.331959 master-0 kubenswrapper[4172]: I0307 21:14:21.331508 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca"
Mar 07 21:14:21.332077 master-0 kubenswrapper[4172]: I0307 21:14:21.332058 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 07 21:14:21.332165 master-0 kubenswrapper[4172]: I0307 21:14:21.332145 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"]
Mar 07 21:14:21.332308 master-0 kubenswrapper[4172]: I0307 21:14:21.332284 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 07 21:14:21.332874 master-0 kubenswrapper[4172]: I0307 21:14:21.332809 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"]
Mar 07 21:14:21.333501 master-0 kubenswrapper[4172]: I0307 21:14:21.333477 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-tklw9"]
Mar 07 21:14:21.334210 master-0 kubenswrapper[4172]: I0307 21:14:21.334182 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"]
Mar 07 21:14:21.335363 master-0 kubenswrapper[4172]: I0307 21:14:21.335327 4172 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-operator/iptables-alerter-n8nz9"]
Mar 07 21:14:21.336473 master-0 kubenswrapper[4172]: I0307 21:14:21.336425 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:21.337556 master-0 kubenswrapper[4172]: I0307 21:14:21.337527 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-cb227"]
Mar 07 21:14:21.339478 master-0 kubenswrapper[4172]: I0307 21:14:21.339445 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"]
Mar 07 21:14:21.339789 master-0 kubenswrapper[4172]: I0307 21:14:21.339760 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 07 21:14:21.345192 master-0 kubenswrapper[4172]: I0307 21:14:21.345143 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-mmqbs"]
Mar 07 21:14:21.346241 master-0 kubenswrapper[4172]: I0307 21:14:21.346193 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"]
Mar 07 21:14:21.346947 master-0 kubenswrapper[4172]: I0307 21:14:21.346912 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-wqqqr"]
Mar 07 21:14:21.347647 master-0 kubenswrapper[4172]: I0307 21:14:21.347616 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"]
Mar 07 21:14:21.348817 master-0 kubenswrapper[4172]: I0307 21:14:21.348788 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"]
Mar 07 21:14:21.349194 master-0 kubenswrapper[4172]: I0307 21:14:21.349163 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"]
Mar 07 21:14:21.350002 master-0 kubenswrapper[4172]: I0307 21:14:21.349967 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5"]
Mar 07 21:14:21.350710 master-0 kubenswrapper[4172]: I0307 21:14:21.350656 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"]
Mar 07 21:14:21.352269 master-0 kubenswrapper[4172]: I0307 21:14:21.352230 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"]
Mar 07 21:14:21.353506 master-0 kubenswrapper[4172]: I0307 21:14:21.353464 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"]
Mar 07 21:14:21.355084 master-0 kubenswrapper[4172]: I0307 21:14:21.355046 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"]
Mar 07 21:14:21.357094 master-0 kubenswrapper[4172]: I0307 21:14:21.357051 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"]
Mar 07 21:14:21.362148 master-0 kubenswrapper[4172]: I0307 21:14:21.362113 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:14:21.362257 master-0 kubenswrapper[4172]: I0307 21:14:21.362237 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:21.362353 master-0 kubenswrapper[4172]: I0307 21:14:21.362335 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f9rq\" (UniqueName: \"kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:21.362428 master-0 kubenswrapper[4172]: I0307 21:14:21.362415 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:21.362509 master-0 kubenswrapper[4172]: I0307 21:14:21.362494 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:14:21.362585 master-0 kubenswrapper[4172]: I0307 21:14:21.362571 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:21.362667 master-0 kubenswrapper[4172]: I0307 21:14:21.362655 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:21.362767 master-0 kubenswrapper[4172]: I0307 21:14:21.362752 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjtgs\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:21.362840 master-0 kubenswrapper[4172]: I0307 21:14:21.362827 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f748l\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:21.362916 master-0 kubenswrapper[4172]: I0307 21:14:21.362903 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c76ff\" (UniqueName: \"kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:21.362986 master-0 kubenswrapper[4172]: I0307 21:14:21.362974 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.363064 master-0 kubenswrapper[4172]: I0307 21:14:21.363051 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:21.363135 master-0 kubenswrapper[4172]: I0307 21:14:21.363122 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:21.363289 master-0 kubenswrapper[4172]: I0307 21:14:21.363251 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:21.363345 master-0 kubenswrapper[4172]: I0307 21:14:21.363299 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.363345 master-0 kubenswrapper[4172]: I0307 21:14:21.363334 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:21.363400 master-0 kubenswrapper[4172]: I0307 21:14:21.363357 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:21.363400 master-0 kubenswrapper[4172]: I0307 21:14:21.363383 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:21.363460 master-0 kubenswrapper[4172]: I0307 21:14:21.363405 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:21.363460 master-0 kubenswrapper[4172]: I0307 21:14:21.363447 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:14:21.363514 master-0 kubenswrapper[4172]: I0307 21:14:21.363469 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqwrr\" (UniqueName: \"kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:14:21.363514 master-0 kubenswrapper[4172]: I0307 21:14:21.363493 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:14:21.363581 master-0 kubenswrapper[4172]: I0307 21:14:21.363548 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsspm\" (UniqueName: \"kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:14:21.363581 master-0 kubenswrapper[4172]: I0307 21:14:21.363573 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:14:21.363635 master-0 kubenswrapper[4172]: I0307 21:14:21.363594 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p2tvr\" (UniqueName: \"kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:21.363635 master-0 kubenswrapper[4172]: I0307 21:14:21.363618 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName:
\"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:21.363705 master-0 kubenswrapper[4172]: I0307 21:14:21.363642 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.363705 master-0 kubenswrapper[4172]: I0307 21:14:21.363663 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:21.363705 master-0 kubenswrapper[4172]: I0307 21:14:21.363701 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:21.363791 master-0 kubenswrapper[4172]: I0307 21:14:21.363739 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert\") pod 
\"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:21.363791 master-0 kubenswrapper[4172]: I0307 21:14:21.363764 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.363791 master-0 kubenswrapper[4172]: I0307 21:14:21.363785 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:21.363874 master-0 kubenswrapper[4172]: I0307 21:14:21.363806 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:21.363874 master-0 kubenswrapper[4172]: I0307 21:14:21.363827 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zb5zm\" (UniqueName: \"kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: 
\"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:21.363874 master-0 kubenswrapper[4172]: I0307 21:14:21.363861 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:21.363974 master-0 kubenswrapper[4172]: I0307 21:14:21.363908 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:21.364009 master-0 kubenswrapper[4172]: I0307 21:14:21.363974 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:14:21.364073 master-0 kubenswrapper[4172]: I0307 21:14:21.364043 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.364108 master-0 
kubenswrapper[4172]: I0307 21:14:21.364088 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.364152 master-0 kubenswrapper[4172]: I0307 21:14:21.364129 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.364182 master-0 kubenswrapper[4172]: I0307 21:14:21.364165 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.364219 master-0 kubenswrapper[4172]: I0307 21:14:21.364194 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qskh\" (UniqueName: \"kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.364248 master-0 kubenswrapper[4172]: I0307 21:14:21.364223 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.364295 master-0 kubenswrapper[4172]: I0307 21:14:21.364269 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65pgv\" (UniqueName: \"kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:21.364326 master-0 kubenswrapper[4172]: I0307 21:14:21.364308 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lng9v\" (UniqueName: \"kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:21.364359 master-0 kubenswrapper[4172]: I0307 21:14:21.364338 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:21.364387 master-0 kubenswrapper[4172]: I0307 21:14:21.364366 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") 
pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.364418 master-0 kubenswrapper[4172]: I0307 21:14:21.364396 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:21.364447 master-0 kubenswrapper[4172]: I0307 21:14:21.364432 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.364490 master-0 kubenswrapper[4172]: I0307 21:14:21.364464 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.364527 master-0 kubenswrapper[4172]: I0307 21:14:21.364505 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:21.364556 
master-0 kubenswrapper[4172]: I0307 21:14:21.364534 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:21.364585 master-0 kubenswrapper[4172]: I0307 21:14:21.364570 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.364615 master-0 kubenswrapper[4172]: I0307 21:14:21.364601 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:21.364652 master-0 kubenswrapper[4172]: I0307 21:14:21.364633 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbz9p\" (UniqueName: \"kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.364712 master-0 kubenswrapper[4172]: I0307 21:14:21.364666 4172 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:21.364748 master-0 kubenswrapper[4172]: I0307 21:14:21.364727 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wvpvs\" (UniqueName: \"kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:14:21.364785 master-0 kubenswrapper[4172]: I0307 21:14:21.364761 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnnlw\" (UniqueName: \"kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.364817 master-0 kubenswrapper[4172]: I0307 21:14:21.364796 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2w44\" (UniqueName: \"kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:21.465670 master-0 kubenswrapper[4172]: I0307 21:14:21.465609 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:21.465818 master-0 kubenswrapper[4172]: I0307 21:14:21.465712 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f9rq\" (UniqueName: \"kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:21.465818 master-0 kubenswrapper[4172]: I0307 21:14:21.465759 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:21.465818 master-0 kubenswrapper[4172]: I0307 21:14:21.465798 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:21.465906 master-0 kubenswrapper[4172]: I0307 21:14:21.465837 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: 
\"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:21.465906 master-0 kubenswrapper[4172]: I0307 21:14:21.465873 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:21.465961 master-0 kubenswrapper[4172]: I0307 21:14:21.465907 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:21.465961 master-0 kubenswrapper[4172]: I0307 21:14:21.465942 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:21.466014 master-0 kubenswrapper[4172]: I0307 21:14:21.465981 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f748l\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.466042 master-0 kubenswrapper[4172]: 
I0307 21:14:21.466016 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:21.466074 master-0 kubenswrapper[4172]: I0307 21:14:21.466050 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.466105 master-0 kubenswrapper[4172]: I0307 21:14:21.466081 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjtgs\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.466137 master-0 kubenswrapper[4172]: I0307 21:14:21.466117 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76ff\" (UniqueName: \"kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:21.466165 master-0 kubenswrapper[4172]: I0307 21:14:21.466152 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.466216 master-0 kubenswrapper[4172]: I0307 21:14:21.466186 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.466256 master-0 kubenswrapper[4172]: I0307 21:14:21.466228 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.466287 master-0 kubenswrapper[4172]: I0307 21:14:21.466259 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:21.466316 master-0 kubenswrapper[4172]: I0307 21:14:21.466289 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:21.466343 master-0 
kubenswrapper[4172]: I0307 21:14:21.466323 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:21.466391 master-0 kubenswrapper[4172]: I0307 21:14:21.466361 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.466815 master-0 kubenswrapper[4172]: I0307 21:14:21.466767 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:21.467221 master-0 kubenswrapper[4172]: E0307 21:14:21.467185 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:21.467282 master-0 kubenswrapper[4172]: E0307 21:14:21.467272 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.967246211 +0000 UTC m=+112.639664118 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:21.467388 master-0 kubenswrapper[4172]: E0307 21:14:21.467362 4172 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 07 21:14:21.467527 master-0 kubenswrapper[4172]: E0307 21:14:21.467498 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.967477357 +0000 UTC m=+112.639895254 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:14:21.468194 master-0 kubenswrapper[4172]: I0307 21:14:21.468136 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:21.468409 master-0 kubenswrapper[4172]: I0307 21:14:21.468262 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:14:21.468409 master-0 kubenswrapper[4172]: I0307 21:14:21.468350 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.468957 master-0 kubenswrapper[4172]: I0307 21:14:21.468900 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:21.469006 master-0 kubenswrapper[4172]: I0307 21:14:21.468969 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqwrr\" (UniqueName: \"kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:21.469006 master-0 kubenswrapper[4172]: I0307 21:14:21.468997 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:21.469078 master-0 kubenswrapper[4172]: I0307 21:14:21.469022 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:21.469078 master-0 kubenswrapper[4172]: I0307 21:14:21.469044 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tvr\" (UniqueName: \"kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:21.469078 master-0 kubenswrapper[4172]: I0307 21:14:21.469070 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsspm\" (UniqueName: \"kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:21.469158 master-0 kubenswrapper[4172]: I0307 21:14:21.469096 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.469158 master-0 kubenswrapper[4172]: I0307 
21:14:21.469119 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:21.469158 master-0 kubenswrapper[4172]: I0307 21:14:21.469149 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:21.469240 master-0 kubenswrapper[4172]: I0307 21:14:21.469170 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:21.469240 master-0 kubenswrapper[4172]: I0307 21:14:21.469191 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:21.469240 master-0 kubenswrapper[4172]: I0307 21:14:21.469217 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-jbggb\" (UniqueName: \"kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:21.469240 master-0 kubenswrapper[4172]: I0307 21:14:21.469241 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469265 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469289 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469310 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rxkw8\" (UniqueName: \"kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: 
\"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469344 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469369 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5zm\" (UniqueName: \"kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469388 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469385 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.469427 master-0 
kubenswrapper[4172]: I0307 21:14:21.469414 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:21.469427 master-0 kubenswrapper[4172]: I0307 21:14:21.469433 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469460 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469483 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469503 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469522 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469529 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469548 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjbmk\" (UniqueName: \"kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk\") pod \"csi-snapshot-controller-operator-5685fbc7d-txnh5\" (UID: \"ab2f6566-730d-46f5-92ed-79e3039d24e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469577 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 
21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469595 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qskh\" (UniqueName: \"kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469615 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469635 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469659 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65pgv\" (UniqueName: \"kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469694 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-lng9v\" (UniqueName: \"kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469717 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469736 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.469780 master-0 kubenswrapper[4172]: I0307 21:14:21.469753 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469772 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " 
pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469787 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469810 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w94dz\" (UniqueName: \"kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469832 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469864 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469884 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469904 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469924 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469932 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.470184 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: 
\"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.468903 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:21.470296 master-0 kubenswrapper[4172]: I0307 21:14:21.469949 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpvs\" (UniqueName: \"kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:14:21.470806 master-0 kubenswrapper[4172]: E0307 21:14:21.470516 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 07 21:14:21.470806 master-0 kubenswrapper[4172]: E0307 21:14:21.470561 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.970543987 +0000 UTC m=+112.642961884 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found Mar 07 21:14:21.470806 master-0 kubenswrapper[4172]: I0307 21:14:21.470658 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:21.472231 master-0 kubenswrapper[4172]: I0307 21:14:21.472197 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.472311 master-0 kubenswrapper[4172]: I0307 21:14:21.472290 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnnlw\" (UniqueName: \"kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.472367 master-0 kubenswrapper[4172]: I0307 21:14:21.472332 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbz9p\" (UniqueName: \"kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: 
\"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.472408 master-0 kubenswrapper[4172]: I0307 21:14:21.472370 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:21.472408 master-0 kubenswrapper[4172]: I0307 21:14:21.472409 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:21.472549 master-0 kubenswrapper[4172]: I0307 21:14:21.472447 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2w44\" (UniqueName: \"kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:21.472549 master-0 kubenswrapper[4172]: I0307 21:14:21.472478 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 
21:14:21.472549 master-0 kubenswrapper[4172]: I0307 21:14:21.472512 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:21.472656 master-0 kubenswrapper[4172]: I0307 21:14:21.472571 4172 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:21.473182 master-0 kubenswrapper[4172]: I0307 21:14:21.473141 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:21.473543 master-0 kubenswrapper[4172]: I0307 21:14:21.473488 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:21.473726 master-0 kubenswrapper[4172]: I0307 21:14:21.473702 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config\") pod 
\"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:21.474342 master-0 kubenswrapper[4172]: E0307 21:14:21.474321 4172 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:21.474386 master-0 kubenswrapper[4172]: E0307 21:14:21.474363 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.974353876 +0000 UTC m=+112.646771773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:21.474954 master-0 kubenswrapper[4172]: I0307 21:14:21.474918 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:21.475295 master-0 kubenswrapper[4172]: I0307 21:14:21.475250 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:21.475912 master-0 kubenswrapper[4172]: I0307 21:14:21.475875 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:21.476538 master-0 kubenswrapper[4172]: I0307 21:14:21.476499 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:21.476583 master-0 kubenswrapper[4172]: I0307 21:14:21.476504 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.476645 master-0 kubenswrapper[4172]: E0307 21:14:21.476591 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 07 21:14:21.476729 master-0 kubenswrapper[4172]: E0307 21:14:21.476671 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.976641586 +0000 UTC m=+112.649059513 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found
Mar 07 21:14:21.477464 master-0 kubenswrapper[4172]: I0307 21:14:21.477429 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:21.478651 master-0 kubenswrapper[4172]: I0307 21:14:21.478569 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:14:21.479795 master-0 kubenswrapper[4172]: I0307 21:14:21.479163 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.479795 master-0 kubenswrapper[4172]: I0307 21:14:21.479432 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.479795 master-0 kubenswrapper[4172]: I0307 21:14:21.479653 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:14:21.480527 master-0 kubenswrapper[4172]: E0307 21:14:21.480421 4172 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:21.480527 master-0 kubenswrapper[4172]: E0307 21:14:21.480499 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.980479066 +0000 UTC m=+112.652896963 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found
Mar 07 21:14:21.480783 master-0 kubenswrapper[4172]: E0307 21:14:21.480702 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 07 21:14:21.480783 master-0 kubenswrapper[4172]: E0307 21:14:21.480735 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.980726623 +0000 UTC m=+112.653144520 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found
Mar 07 21:14:21.483238 master-0 kubenswrapper[4172]: E0307 21:14:21.482240 4172 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 07 21:14:21.483238 master-0 kubenswrapper[4172]: E0307 21:14:21.482446 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 07 21:14:21.483238 master-0 kubenswrapper[4172]: E0307 21:14:21.482481 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.982471858 +0000 UTC m=+112.654889755 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found
Mar 07 21:14:21.483238 master-0 kubenswrapper[4172]: I0307 21:14:21.483196 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:14:21.483383 master-0 kubenswrapper[4172]: E0307 21:14:21.483272 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:21.983240027 +0000 UTC m=+112.655657944 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found
Mar 07 21:14:21.484538 master-0 kubenswrapper[4172]: I0307 21:14:21.484487 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:21.484538 master-0 kubenswrapper[4172]: I0307 21:14:21.484520 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:21.484760 master-0 kubenswrapper[4172]: I0307 21:14:21.484720 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:14:21.485309 master-0 kubenswrapper[4172]: I0307 21:14:21.485261 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:14:21.486819 master-0 kubenswrapper[4172]: I0307 21:14:21.486751 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.496110 master-0 kubenswrapper[4172]: I0307 21:14:21.490110 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:21.496110 master-0 kubenswrapper[4172]: I0307 21:14:21.490652 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:21.496110 master-0 kubenswrapper[4172]: I0307 21:14:21.493857 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:14:21.496914 master-0 kubenswrapper[4172]: I0307 21:14:21.496871 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f9rq\" (UniqueName: \"kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:21.497043 master-0 kubenswrapper[4172]: I0307 21:14:21.497018 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:21.500251 master-0 kubenswrapper[4172]: I0307 21:14:21.500204 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76ff\" (UniqueName: \"kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:21.500884 master-0 kubenswrapper[4172]: I0307 21:14:21.500859 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:21.500949 master-0 kubenswrapper[4172]: I0307 21:14:21.500861 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65pgv\" (UniqueName: \"kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:21.503584 master-0 kubenswrapper[4172]: I0307 21:14:21.503544 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpvs\" (UniqueName: \"kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:21.503967 master-0 kubenswrapper[4172]: I0307 21:14:21.503922 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:21.505585 master-0 kubenswrapper[4172]: I0307 21:14:21.505553 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qskh\" (UniqueName: \"kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.508474 master-0 kubenswrapper[4172]: I0307 21:14:21.508426 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f748l\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:21.510524 master-0 kubenswrapper[4172]: I0307 21:14:21.510495 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:21.513088 master-0 kubenswrapper[4172]: I0307 21:14:21.513057 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tvr\" (UniqueName: \"kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:21.520531 master-0 kubenswrapper[4172]: I0307 21:14:21.520499 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsspm\" (UniqueName: \"kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:14:21.544495 master-0 kubenswrapper[4172]: I0307 21:14:21.544450 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5zm\" (UniqueName: \"kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:21.561092 master-0 kubenswrapper[4172]: I0307 21:14:21.561053 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:14:21.565166 master-0 kubenswrapper[4172]: I0307 21:14:21.565132 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:14:21.573323 master-0 kubenswrapper[4172]: I0307 21:14:21.573272 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.573374 master-0 kubenswrapper[4172]: I0307 21:14:21.573340 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.573695 master-0 kubenswrapper[4172]: E0307 21:14:21.573624 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 07 21:14:21.573853 master-0 kubenswrapper[4172]: I0307 21:14:21.573638 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:21.573897 master-0 kubenswrapper[4172]: E0307 21:14:21.573830 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.073768477 +0000 UTC m=+112.746186384 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found
Mar 07 21:14:21.573897 master-0 kubenswrapper[4172]: I0307 21:14:21.573888 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:21.573981 master-0 kubenswrapper[4172]: I0307 21:14:21.573930 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:21.573981 master-0 kubenswrapper[4172]: I0307 21:14:21.573957 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:14:21.574032 master-0 kubenswrapper[4172]: I0307 21:14:21.573991 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:14:21.574032 master-0 kubenswrapper[4172]: I0307 21:14:21.574010 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:14:21.574154 master-0 kubenswrapper[4172]: I0307 21:14:21.574129 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.574347 master-0 kubenswrapper[4172]: E0307 21:14:21.574322 4172 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 07 21:14:21.574389 master-0 kubenswrapper[4172]: E0307 21:14:21.574378 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.074361182 +0000 UTC m=+112.746779089 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found
Mar 07 21:14:21.574563 master-0 kubenswrapper[4172]: E0307 21:14:21.574538 4172 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:21.574623 master-0 kubenswrapper[4172]: E0307 21:14:21.574601 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.074580319 +0000 UTC m=+112.746998216 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found
Mar 07 21:14:21.574748 master-0 kubenswrapper[4172]: I0307 21:14:21.574726 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.575111 master-0 kubenswrapper[4172]: I0307 21:14:21.575052 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:21.575158 master-0 kubenswrapper[4172]: I0307 21:14:21.575083 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbggb\" (UniqueName: \"kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:14:21.575187 master-0 kubenswrapper[4172]: I0307 21:14:21.575176 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.575219 master-0 kubenswrapper[4172]: I0307 21:14:21.575207 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxkw8\" (UniqueName: \"kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.575278 master-0 kubenswrapper[4172]: I0307 21:14:21.575253 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:21.575431 master-0 kubenswrapper[4172]: I0307 21:14:21.575410 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbmk\" (UniqueName: \"kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk\") pod \"csi-snapshot-controller-operator-5685fbc7d-txnh5\" (UID: \"ab2f6566-730d-46f5-92ed-79e3039d24e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5"
Mar 07 21:14:21.575497 master-0 kubenswrapper[4172]: I0307 21:14:21.575481 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:21.576091 master-0 kubenswrapper[4172]: E0307 21:14:21.576072 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 07 21:14:21.576158 master-0 kubenswrapper[4172]: I0307 21:14:21.576075 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.576158 master-0 kubenswrapper[4172]: E0307 21:14:21.576129 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.076109659 +0000 UTC m=+112.748527566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 07 21:14:21.576401 master-0 kubenswrapper[4172]: I0307 21:14:21.576364 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94dz\" (UniqueName: \"kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:21.576779 master-0 kubenswrapper[4172]: I0307 21:14:21.576732 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:21.577136 master-0 kubenswrapper[4172]: I0307 21:14:21.577087 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:21.577785 master-0 kubenswrapper[4172]: I0307 21:14:21.577749 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:21.586357 master-0 kubenswrapper[4172]: I0307 21:14:21.586322 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqwrr\" (UniqueName: \"kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:14:21.607319 master-0 kubenswrapper[4172]: I0307 21:14:21.607258 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lng9v\" (UniqueName: \"kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:14:21.613465 master-0 kubenswrapper[4172]: I0307 21:14:21.613419 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:21.631751 master-0 kubenswrapper[4172]: I0307 21:14:21.631721 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:14:21.644059 master-0 kubenswrapper[4172]: I0307 21:14:21.644004 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2w44\" (UniqueName: \"kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:14:21.668819 master-0 kubenswrapper[4172]: I0307 21:14:21.668760 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:14:21.677752 master-0 kubenswrapper[4172]: I0307 21:14:21.677380 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbz9p\" (UniqueName: \"kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:21.688247 master-0 kubenswrapper[4172]: I0307 21:14:21.688187 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjtgs\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:21.701852 master-0 kubenswrapper[4172]: I0307 21:14:21.701773 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:21.713065 master-0 kubenswrapper[4172]: I0307 21:14:21.712316 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnlw\" (UniqueName: \"kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:21.725893 master-0 kubenswrapper[4172]: I0307 21:14:21.725802 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:14:21.741264 master-0 kubenswrapper[4172]: I0307 21:14:21.741174 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:21.749598 master-0 kubenswrapper[4172]: I0307 21:14:21.749454 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:21.753472 master-0 kubenswrapper[4172]: I0307 21:14:21.753417 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbggb\" (UniqueName: \"kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:14:21.756842 master-0 kubenswrapper[4172]: I0307 21:14:21.756780 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:21.768535 master-0 kubenswrapper[4172]: I0307 21:14:21.766126 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:21.768535 master-0 kubenswrapper[4172]: I0307 21:14:21.767414 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbmk\" (UniqueName: \"kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk\") pod \"csi-snapshot-controller-operator-5685fbc7d-txnh5\" (UID: \"ab2f6566-730d-46f5-92ed-79e3039d24e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5"
Mar 07 21:14:21.782367 master-0 kubenswrapper[4172]: I0307 21:14:21.782252 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxkw8\" (UniqueName: \"kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:21.809658 master-0 kubenswrapper[4172]: I0307 21:14:21.809065 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:21.832197 master-0 kubenswrapper[4172]: I0307 21:14:21.831888 4172 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94dz\" (UniqueName: \"kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:21.846254 master-0 kubenswrapper[4172]: I0307 21:14:21.846003 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"]
Mar 07 21:14:21.864331 master-0 kubenswrapper[4172]: I0307 21:14:21.863057 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:14:21.865009 master-0 kubenswrapper[4172]: I0307 21:14:21.864974 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"]
Mar 07 21:14:21.883504 master-0 kubenswrapper[4172]: I0307 21:14:21.883315 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" Mar 07 21:14:21.883829 master-0 kubenswrapper[4172]: W0307 21:14:21.883773 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3faedef9_d507_48aa_82a8_f3dc9b5adeef.slice/crio-90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7 WatchSource:0}: Error finding container 90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7: Status 404 returned error can't find the container with id 90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7 Mar 07 21:14:21.901379 master-0 kubenswrapper[4172]: I0307 21:14:21.900575 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" event={"ID":"3faedef9-d507-48aa-82a8-f3dc9b5adeef","Type":"ContainerStarted","Data":"90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7"} Mar 07 21:14:21.907970 master-0 kubenswrapper[4172]: I0307 21:14:21.906005 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"] Mar 07 21:14:21.915768 master-0 kubenswrapper[4172]: I0307 21:14:21.915198 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:14:21.942239 master-0 kubenswrapper[4172]: I0307 21:14:21.942191 4172 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:21.956457 master-0 kubenswrapper[4172]: W0307 21:14:21.956332 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b339e6a_cae6_416a_963b_2fd23cecba96.slice/crio-d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d WatchSource:0}: Error finding container d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d: Status 404 returned error can't find the container with id d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d Mar 07 21:14:21.975917 master-0 kubenswrapper[4172]: I0307 21:14:21.974809 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"] Mar 07 21:14:21.976481 master-0 kubenswrapper[4172]: I0307 21:14:21.976410 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"] Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984428 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984476 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " 
pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984504 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984574 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984608 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984642 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984669 4172 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984724 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: I0307 21:14:21.984750 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: E0307 21:14:21.984929 4172 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: E0307 21:14:21.985011 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.984968745 +0000 UTC m=+113.657386642 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: E0307 21:14:21.985235 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: E0307 21:14:21.985316 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.985289073 +0000 UTC m=+113.657706960 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: E0307 21:14:21.985362 4172 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:21.985776 master-0 kubenswrapper[4172]: E0307 21:14:21.985387 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.985380355 +0000 UTC m=+113.657798252 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985449 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985485 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.985474528 +0000 UTC m=+113.657892625 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985505 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985531 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.985524499 +0000 UTC m=+113.657942396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985550 4172 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985571 4172 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985585 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.98557547 +0000 UTC m=+113.657993577 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985607 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.985597091 +0000 UTC m=+113.658015198 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985450 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985619 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985641 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.985633522 +0000 UTC m=+113.658051419 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found Mar 07 21:14:21.986244 master-0 kubenswrapper[4172]: E0307 21:14:21.985658 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:22.985648662 +0000 UTC m=+113.658066559 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:21.992481 master-0 kubenswrapper[4172]: W0307 21:14:21.992427 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff7c5ff2_49d2_4a55_96d1_5244ae8ad602.slice/crio-cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960 WatchSource:0}: Error finding container cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960: Status 404 returned error can't find the container with id cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960 Mar 07 21:14:22.030232 master-0 kubenswrapper[4172]: W0307 21:14:22.030159 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode543d99f_e0dc_49be_95bd_c39eabd05ce8.slice/crio-8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601 WatchSource:0}: Error finding container 8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601: Status 404 returned error can't find the container with id 8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601 Mar 07 21:14:22.046948 master-0 kubenswrapper[4172]: I0307 21:14:22.033233 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"] Mar 07 21:14:22.051974 master-0 kubenswrapper[4172]: I0307 21:14:22.051938 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"] Mar 07 21:14:22.088295 master-0 kubenswrapper[4172]: I0307 21:14:22.088245 4172 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:22.088412 master-0 kubenswrapper[4172]: I0307 21:14:22.088319 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:22.088412 master-0 kubenswrapper[4172]: I0307 21:14:22.088348 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:22.088489 master-0 kubenswrapper[4172]: I0307 21:14:22.088429 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:22.088631 master-0 kubenswrapper[4172]: E0307 21:14:22.088602 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:22.088701 master-0 kubenswrapper[4172]: E0307 21:14:22.088671 4172 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:23.088654137 +0000 UTC m=+113.761072034 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:22.088757 master-0 kubenswrapper[4172]: E0307 21:14:22.088735 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:22.088757 master-0 kubenswrapper[4172]: E0307 21:14:22.088757 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:23.08875061 +0000 UTC m=+113.761168497 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:22.088934 master-0 kubenswrapper[4172]: E0307 21:14:22.088806 4172 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:22.088934 master-0 kubenswrapper[4172]: E0307 21:14:22.088823 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. 
No retries permitted until 2026-03-07 21:14:23.088818592 +0000 UTC m=+113.761236489 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found Mar 07 21:14:22.088934 master-0 kubenswrapper[4172]: E0307 21:14:22.088859 4172 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 07 21:14:22.088934 master-0 kubenswrapper[4172]: E0307 21:14:22.088878 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:23.088871103 +0000 UTC m=+113.761289000 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found Mar 07 21:14:22.091339 master-0 kubenswrapper[4172]: I0307 21:14:22.091306 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"] Mar 07 21:14:22.092880 master-0 kubenswrapper[4172]: W0307 21:14:22.092841 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd633b72_3d0b_4601_a2c2_3f487d943b35.slice/crio-08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84 WatchSource:0}: Error finding container 08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84: Status 404 returned error can't find the container with id 
08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84 Mar 07 21:14:22.105877 master-0 kubenswrapper[4172]: W0307 21:14:22.105377 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8269652e_360f_43ef_9e7d_473c5f478275.slice/crio-f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740 WatchSource:0}: Error finding container f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740: Status 404 returned error can't find the container with id f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740 Mar 07 21:14:22.106821 master-0 kubenswrapper[4172]: I0307 21:14:22.106617 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"] Mar 07 21:14:22.178017 master-0 kubenswrapper[4172]: I0307 21:14:22.177770 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-64488f9d78-cb227"] Mar 07 21:14:22.178864 master-0 kubenswrapper[4172]: I0307 21:14:22.178820 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5"] Mar 07 21:14:22.199377 master-0 kubenswrapper[4172]: W0307 21:14:22.193113 4172 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29624e4f_d970_4dfa_a8f1_515b73397c8f.slice/crio-0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de WatchSource:0}: Error finding container 0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de: Status 404 returned error can't find the container with id 0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de Mar 07 21:14:22.225333 master-0 kubenswrapper[4172]: I0307 21:14:22.221967 4172 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"] Mar 07 21:14:22.255166 master-0 kubenswrapper[4172]: E0307 21:14:22.255026 4172 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-controller-manager-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56,Command:[cluster-kube-controller-manager-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56,ValueFrom:nil,},EnvVar{Name:CLUSTER_POLICY_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609,ValueFrom:nil,},EnvVar{Name:TOOLS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35768a0c3eb24134dd38633e8acfc7db69ee96b2fd660e9bba3b8c996452fef7,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:1.31.14,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod kube-controller-manager-operator-86d7cdfdfb-wb26b_openshift-kube-controller-manager-operator(abfb5602-7255-43d7-a510-e7f94885887e): ErrImagePull: pull QPS exceeded" logger="UnhandledError"
Mar 07 21:14:22.256528 master-0 kubenswrapper[4172]: E0307 21:14:22.256458 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ErrImagePull: \"pull QPS exceeded\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" podUID="abfb5602-7255-43d7-a510-e7f94885887e"
Mar 07 21:14:22.280730 master-0 kubenswrapper[4172]: I0307 21:14:22.280218 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:14:22.281382 master-0 kubenswrapper[4172]: I0307 21:14:22.280978 4172 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:14:22.306609 master-0 kubenswrapper[4172]: I0307 21:14:22.306567 4172 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 07 21:14:22.326429 master-0 kubenswrapper[4172]: I0307 21:14:22.326393 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt"
Mar 07 21:14:22.347101 master-0 kubenswrapper[4172]: I0307 21:14:22.347071 4172 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 07 21:14:22.906638 master-0 kubenswrapper[4172]: I0307 21:14:22.906571 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" event={"ID":"b88c5fbe-e19f-45b3-ab03-e1626f95776d","Type":"ContainerStarted","Data":"41f511d18c601df3347c4a0ec791b96cc20cc186c323e16593ccc895c8986828"}
Mar 07 21:14:22.907796 master-0 kubenswrapper[4172]: I0307 21:14:22.907768 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" event={"ID":"abfb5602-7255-43d7-a510-e7f94885887e","Type":"ContainerStarted","Data":"831064ad19912357852f314d15373db7b732cf6bf4313483f541003aef1dbf06"}
Mar 07 21:14:22.910032 master-0 kubenswrapper[4172]: E0307 21:14:22.909818 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" podUID="abfb5602-7255-43d7-a510-e7f94885887e"
Mar 07 21:14:22.910630 master-0 kubenswrapper[4172]: I0307 21:14:22.910597 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" event={"ID":"5f82d4aa-0cb5-477f-944e-745a21d124fc","Type":"ContainerStarted","Data":"aa4738248c68a5f24174bfee8718356f164d810da170930b0082bb8c35862648"}
Mar 07 21:14:22.912489 master-0 kubenswrapper[4172]: I0307 21:14:22.912410 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" event={"ID":"24f69689-ff12-4786-af05-61429e9eadf8","Type":"ContainerStarted","Data":"e946a5469a45f458f7f3463d40633d8f93666f0c7a05ec65f3cba4034066232a"}
Mar 07 21:14:22.914320 master-0 kubenswrapper[4172]: I0307 21:14:22.914285 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" event={"ID":"5b339e6a-cae6-416a-963b-2fd23cecba96","Type":"ContainerStarted","Data":"4ce1bc8e249944d7cde9f138282cf087a8521cf190e44f0f1b32f20172ea8a91"}
Mar 07 21:14:22.914320 master-0 kubenswrapper[4172]: I0307 21:14:22.914318 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" event={"ID":"5b339e6a-cae6-416a-963b-2fd23cecba96","Type":"ContainerStarted","Data":"d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d"}
Mar 07 21:14:22.915564 master-0 kubenswrapper[4172]: I0307 21:14:22.915540 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" event={"ID":"bd633b72-3d0b-4601-a2c2-3f487d943b35","Type":"ContainerStarted","Data":"08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84"}
Mar 07 21:14:22.916642 master-0 kubenswrapper[4172]: I0307 21:14:22.916618 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" event={"ID":"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602","Type":"ContainerStarted","Data":"cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960"}
Mar 07 21:14:22.917584 master-0 kubenswrapper[4172]: I0307 21:14:22.917554 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" event={"ID":"ab2f6566-730d-46f5-92ed-79e3039d24e8","Type":"ContainerStarted","Data":"8ec6f338d22c639a620f442f8c4c1b118ba292e32b27dd86d02fc64df14c1372"}
Mar 07 21:14:22.918532 master-0 kubenswrapper[4172]: I0307 21:14:22.918501 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" event={"ID":"29624e4f-d970-4dfa-a8f1-515b73397c8f","Type":"ContainerStarted","Data":"0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de"}
Mar 07 21:14:22.919322 master-0 kubenswrapper[4172]: I0307 21:14:22.919293 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n8nz9" event={"ID":"666475e5-df4b-44ef-a2d4-39d84ab91aad","Type":"ContainerStarted","Data":"f5c25a913e1399497bfa0861960ba8c967f50ac5018f8a2d5b9a421ec9681e9c"}
Mar 07 21:14:22.920037 master-0 kubenswrapper[4172]: I0307 21:14:22.920009 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerStarted","Data":"f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740"}
Mar 07 21:14:22.920887 master-0 kubenswrapper[4172]: I0307 21:14:22.920863 4172 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" event={"ID":"e543d99f-e0dc-49be-95bd-c39eabd05ce8","Type":"ContainerStarted","Data":"8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601"}
Mar 07 21:14:23.002502 master-0 kubenswrapper[4172]: I0307 21:14:23.002461 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:23.002621 master-0 kubenswrapper[4172]: I0307 21:14:23.002527 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:23.002621 master-0 kubenswrapper[4172]: I0307 21:14:23.002554 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:23.002801 master-0 kubenswrapper[4172]: E0307 21:14:23.002413 4172 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:23.002879 master-0 kubenswrapper[4172]: E0307 21:14:23.002858 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.002840525 +0000 UTC m=+115.675258422 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:23.003024 master-0 kubenswrapper[4172]: E0307 21:14:23.003004 4172 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 07 21:14:23.003054 master-0 kubenswrapper[4172]: E0307 21:14:23.003048 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.00303469 +0000 UTC m=+115.675452587 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found
Mar 07 21:14:23.003321 master-0 kubenswrapper[4172]: I0307 21:14:23.003297 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:23.003394 master-0 kubenswrapper[4172]: I0307 21:14:23.003367 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:23.003511 master-0 kubenswrapper[4172]: E0307 21:14:23.003475 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 07 21:14:23.003541 master-0 kubenswrapper[4172]: E0307 21:14:23.003528 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.003519892 +0000 UTC m=+115.675937789 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found
Mar 07 21:14:23.003759 master-0 kubenswrapper[4172]: E0307 21:14:23.003736 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 07 21:14:23.003808 master-0 kubenswrapper[4172]: E0307 21:14:23.003798 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.003782959 +0000 UTC m=+115.676200856 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found
Mar 07 21:14:23.003867 master-0 kubenswrapper[4172]: E0307 21:14:23.003839 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 07 21:14:23.003896 master-0 kubenswrapper[4172]: E0307 21:14:23.003890 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.003882241 +0000 UTC m=+115.676300128 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found
Mar 07 21:14:23.003927 master-0 kubenswrapper[4172]: I0307 21:14:23.003913 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:23.003989 master-0 kubenswrapper[4172]: I0307 21:14:23.003976 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:14:23.004078 master-0 kubenswrapper[4172]: E0307 21:14:23.004067 4172 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:23.004109 master-0 kubenswrapper[4172]: E0307 21:14:23.004102 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.004088857 +0000 UTC m=+115.676506754 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found
Mar 07 21:14:23.004162 master-0 kubenswrapper[4172]: E0307 21:14:23.004151 4172 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 07 21:14:23.004231 master-0 kubenswrapper[4172]: E0307 21:14:23.004221 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 07 21:14:23.004368 master-0 kubenswrapper[4172]: I0307 21:14:23.004345 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:23.004398 master-0 kubenswrapper[4172]: E0307 21:14:23.004375 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.004169059 +0000 UTC m=+115.676586956 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found
Mar 07 21:14:23.004398 master-0 kubenswrapper[4172]: E0307 21:14:23.004390 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.004384515 +0000 UTC m=+115.676802412 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found
Mar 07 21:14:23.004528 master-0 kubenswrapper[4172]: I0307 21:14:23.004514 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:23.004736 master-0 kubenswrapper[4172]: E0307 21:14:23.004723 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 07 21:14:23.004772 master-0 kubenswrapper[4172]: E0307 21:14:23.004757 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.004742874 +0000 UTC m=+115.677160761 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: I0307 21:14:23.105383 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.105519 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.105566 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.105552242 +0000 UTC m=+115.777970139 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: I0307 21:14:23.105997 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: I0307 21:14:23.106054 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.106135 4172 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.106162 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.106150407 +0000 UTC m=+115.778568304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.106289 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.106322 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.106310912 +0000 UTC m=+115.778728809 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: I0307 21:14:23.106900 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.107013 4172 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 07 21:14:23.164573 master-0 kubenswrapper[4172]: E0307 21:14:23.107215 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:25.107206194 +0000 UTC m=+115.779624091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found
Mar 07 21:14:23.615669 master-0 kubenswrapper[4172]: I0307 21:14:23.611924 4172 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" podStartSLOduration=79.611879088 podStartE2EDuration="1m19.611879088s" podCreationTimestamp="2026-03-07 21:13:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:23.610751859 +0000 UTC m=+114.283169766" watchObservedRunningTime="2026-03-07 21:14:23.611879088 +0000 UTC m=+114.284296985"
Mar 07 21:14:24.110005 master-0 kubenswrapper[4172]: E0307 21:14:24.109734 4172 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-controller-manager-operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"\"" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" podUID="abfb5602-7255-43d7-a510-e7f94885887e"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.093541 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094062 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094086 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094138 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094159 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094181 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094239 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094261 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: I0307 21:14:25.094282 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: E0307 21:14:25.094421 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 07 21:14:25.094732 master-0 kubenswrapper[4172]: E0307 21:14:25.094477 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.094460009 +0000 UTC m=+119.766877906 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found
Mar 07 21:14:25.096064 master-0 kubenswrapper[4172]: E0307 21:14:25.095933 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 07 21:14:25.096064 master-0 kubenswrapper[4172]: E0307 21:14:25.095965 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.095957039 +0000 UTC m=+119.768374936 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found
Mar 07 21:14:25.096064 master-0 kubenswrapper[4172]: E0307 21:14:25.096002 4172 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 07 21:14:25.096064 master-0 kubenswrapper[4172]: E0307 21:14:25.096019 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.0960131 +0000 UTC m=+119.768430997 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found
Mar 07 21:14:25.096064 master-0 kubenswrapper[4172]: E0307 21:14:25.096051 4172 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 07 21:14:25.096241 master-0 kubenswrapper[4172]: E0307 21:14:25.096075 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.096062152 +0000 UTC m=+119.768480049 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found
Mar 07 21:14:25.096241 master-0 kubenswrapper[4172]: E0307 21:14:25.096114 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 07 21:14:25.096241 master-0 kubenswrapper[4172]: E0307 21:14:25.096132 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.096126473 +0000 UTC m=+119.768544370 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found
Mar 07 21:14:25.096241 master-0 kubenswrapper[4172]: E0307 21:14:25.096169 4172 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:25.096241 master-0 kubenswrapper[4172]: E0307 21:14:25.096185 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.096180235 +0000 UTC m=+119.768598132 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found
Mar 07 21:14:25.096241 master-0 kubenswrapper[4172]: E0307 21:14:25.096219 4172 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 07 21:14:25.096241 master-0 kubenswrapper[4172]: E0307 21:14:25.096235 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.096230276 +0000 UTC m=+119.768648173 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found
Mar 07 21:14:25.096426 master-0 kubenswrapper[4172]: E0307 21:14:25.096269 4172 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:25.096426 master-0 kubenswrapper[4172]: E0307 21:14:25.096287 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.096282057 +0000 UTC m=+119.768699954 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:25.096426 master-0 kubenswrapper[4172]: E0307 21:14:25.096321 4172 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 07 21:14:25.096426 master-0 kubenswrapper[4172]: E0307 21:14:25.096340 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.096333209 +0000 UTC m=+119.768751106 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found
Mar 07 21:14:25.195457 master-0 kubenswrapper[4172]: I0307 21:14:25.195371 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:25.195714 master-0 kubenswrapper[4172]: I0307 21:14:25.195485 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod
\"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:25.195714 master-0 kubenswrapper[4172]: I0307 21:14:25.195521 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:25.195714 master-0 kubenswrapper[4172]: I0307 21:14:25.195558 4172 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:25.195848 master-0 kubenswrapper[4172]: E0307 21:14:25.195755 4172 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 07 21:14:25.195848 master-0 kubenswrapper[4172]: E0307 21:14:25.195804 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:25.195848 master-0 kubenswrapper[4172]: E0307 21:14:25.195815 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.195797481 +0000 UTC m=+119.868215368 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found Mar 07 21:14:25.196112 master-0 kubenswrapper[4172]: E0307 21:14:25.196075 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.195824922 +0000 UTC m=+119.868242819 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:25.197307 master-0 kubenswrapper[4172]: E0307 21:14:25.196130 4172 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:25.197307 master-0 kubenswrapper[4172]: E0307 21:14:25.196154 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.19614724 +0000 UTC m=+119.868565137 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:25.197307 master-0 kubenswrapper[4172]: E0307 21:14:25.196205 4172 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:25.197307 master-0 kubenswrapper[4172]: E0307 21:14:25.196234 4172 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.196222982 +0000 UTC m=+119.868640879 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found Mar 07 21:14:26.358849 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 07 21:14:26.387973 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 07 21:14:26.388415 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 07 21:14:26.390389 master-0 systemd[1]: kubelet.service: Consumed 11.710s CPU time. Mar 07 21:14:26.407561 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 07 21:14:26.491513 master-0 kubenswrapper[7689]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 21:14:26.491513 master-0 kubenswrapper[7689]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. 
Mar 07 21:14:26.491513 master-0 kubenswrapper[7689]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 21:14:26.491513 master-0 kubenswrapper[7689]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 21:14:26.491513 master-0 kubenswrapper[7689]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 07 21:14:26.491513 master-0 kubenswrapper[7689]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 07 21:14:26.493403 master-0 kubenswrapper[7689]: I0307 21:14:26.491594 7689 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 07 21:14:26.496884 master-0 kubenswrapper[7689]: W0307 21:14:26.496825 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 07 21:14:26.496884 master-0 kubenswrapper[7689]: W0307 21:14:26.496868 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 21:14:26.496884 master-0 kubenswrapper[7689]: W0307 21:14:26.496874 7689 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 21:14:26.496884 master-0 kubenswrapper[7689]: W0307 21:14:26.496879 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 21:14:26.496884 master-0 kubenswrapper[7689]: W0307 21:14:26.496887 7689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 07 21:14:26.496884 master-0 kubenswrapper[7689]: W0307 21:14:26.496896 7689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496905 7689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496912 7689 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496918 7689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496924 7689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496930 7689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496936 7689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496942 7689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496947 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496957 7689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496962 7689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496968 7689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496973 7689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 21:14:26.497187 master-0 
kubenswrapper[7689]: W0307 21:14:26.496979 7689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496985 7689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496992 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.496998 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.497004 7689 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.497009 7689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 21:14:26.497187 master-0 kubenswrapper[7689]: W0307 21:14:26.497014 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497020 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497025 7689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497030 7689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497035 7689 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497040 7689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497045 7689 feature_gate.go:330] unrecognized feature gate: 
SigstoreImageVerification Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497050 7689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497055 7689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497060 7689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497065 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497069 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497074 7689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497079 7689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497086 7689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497095 7689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497100 7689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497106 7689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497113 7689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 21:14:26.498050 master-0 kubenswrapper[7689]: W0307 21:14:26.497118 7689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497124 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497129 7689 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497134 7689 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497140 7689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497145 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497151 7689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497157 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497162 7689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497167 7689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497171 7689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497176 7689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 
21:14:26.497181 7689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497186 7689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497192 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497198 7689 feature_gate.go:330] unrecognized feature gate: Example Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497203 7689 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497208 7689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497214 7689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497219 7689 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497224 7689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 21:14:26.498884 master-0 kubenswrapper[7689]: W0307 21:14:26.497230 7689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: W0307 21:14:26.497235 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: W0307 21:14:26.497240 7689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: W0307 21:14:26.497245 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: W0307 21:14:26.497249 7689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 
21:14:26.499870 master-0 kubenswrapper[7689]: W0307 21:14:26.497254 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: W0307 21:14:26.497259 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: W0307 21:14:26.497265 7689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497415 7689 flags.go:64] FLAG: --address="0.0.0.0" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497430 7689 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497442 7689 flags.go:64] FLAG: --anonymous-auth="true" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497466 7689 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497477 7689 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497483 7689 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497492 7689 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497500 7689 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497506 7689 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497513 7689 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497520 7689 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 07 
21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497526 7689 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497532 7689 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497538 7689 flags.go:64] FLAG: --cgroup-root="" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497543 7689 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 07 21:14:26.499870 master-0 kubenswrapper[7689]: I0307 21:14:26.497548 7689 flags.go:64] FLAG: --client-ca-file="" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497553 7689 flags.go:64] FLAG: --cloud-config="" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497559 7689 flags.go:64] FLAG: --cloud-provider="" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497565 7689 flags.go:64] FLAG: --cluster-dns="[]" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497575 7689 flags.go:64] FLAG: --cluster-domain="" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497581 7689 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497587 7689 flags.go:64] FLAG: --config-dir="" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497592 7689 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497599 7689 flags.go:64] FLAG: --container-log-max-files="5" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497608 7689 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497616 7689 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497622 7689 flags.go:64] 
FLAG: --containerd="/run/containerd/containerd.sock" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497628 7689 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497634 7689 flags.go:64] FLAG: --contention-profiling="false" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497640 7689 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497645 7689 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497652 7689 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497658 7689 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497666 7689 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497672 7689 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497698 7689 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497704 7689 flags.go:64] FLAG: --enable-load-reader="false" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497711 7689 flags.go:64] FLAG: --enable-server="true" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497717 7689 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497725 7689 flags.go:64] FLAG: --event-burst="100" Mar 07 21:14:26.500928 master-0 kubenswrapper[7689]: I0307 21:14:26.497730 7689 flags.go:64] FLAG: --event-qps="50" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497764 7689 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 07 
21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497770 7689 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497776 7689 flags.go:64] FLAG: --eviction-hard="" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497792 7689 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497798 7689 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497805 7689 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497812 7689 flags.go:64] FLAG: --eviction-soft="" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497818 7689 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497823 7689 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497829 7689 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497835 7689 flags.go:64] FLAG: --experimental-mounter-path="" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497841 7689 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497846 7689 flags.go:64] FLAG: --fail-swap-on="true" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497852 7689 flags.go:64] FLAG: --feature-gates="" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497860 7689 flags.go:64] FLAG: --file-check-frequency="20s" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497866 7689 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: 
I0307 21:14:26.497872 7689 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497878 7689 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497884 7689 flags.go:64] FLAG: --healthz-port="10248"
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497890 7689 flags.go:64] FLAG: --help="false"
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497895 7689 flags.go:64] FLAG: --hostname-override=""
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497901 7689 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497918 7689 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497924 7689 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 07 21:14:26.502033 master-0 kubenswrapper[7689]: I0307 21:14:26.497929 7689 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497935 7689 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497941 7689 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497947 7689 flags.go:64] FLAG: --image-service-endpoint=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497953 7689 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497958 7689 flags.go:64] FLAG: --kube-api-burst="100"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497964 7689 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497971 7689 flags.go:64] FLAG: --kube-api-qps="50"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497976 7689 flags.go:64] FLAG: --kube-reserved=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497981 7689 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497987 7689 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497993 7689 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.497998 7689 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498003 7689 flags.go:64] FLAG: --lock-file=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498009 7689 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498015 7689 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498021 7689 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498035 7689 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498040 7689 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498047 7689 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498053 7689 flags.go:64] FLAG: --logging-format="text"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498059 7689 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498073 7689 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498079 7689 flags.go:64] FLAG: --manifest-url=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498085 7689 flags.go:64] FLAG: --manifest-url-header=""
Mar 07 21:14:26.503070 master-0 kubenswrapper[7689]: I0307 21:14:26.498098 7689 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498104 7689 flags.go:64] FLAG: --max-open-files="1000000"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498112 7689 flags.go:64] FLAG: --max-pods="110"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498118 7689 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498124 7689 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498132 7689 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498138 7689 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498143 7689 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498149 7689 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498154 7689 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498170 7689 flags.go:64] FLAG: --node-status-max-images="50"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498175 7689 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498181 7689 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498186 7689 flags.go:64] FLAG: --pod-cidr=""
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498192 7689 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498202 7689 flags.go:64] FLAG: --pod-manifest-path=""
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498208 7689 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498214 7689 flags.go:64] FLAG: --pods-per-core="0"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498219 7689 flags.go:64] FLAG: --port="10250"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498225 7689 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498231 7689 flags.go:64] FLAG: --provider-id=""
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498236 7689 flags.go:64] FLAG: --qos-reserved=""
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498242 7689 flags.go:64] FLAG: --read-only-port="10255"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498250 7689 flags.go:64] FLAG: --register-node="true"
Mar 07 21:14:26.504170 master-0 kubenswrapper[7689]: I0307 21:14:26.498256 7689 flags.go:64] FLAG: --register-schedulable="true"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498263 7689 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498275 7689 flags.go:64] FLAG: --registry-burst="10"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498281 7689 flags.go:64] FLAG: --registry-qps="5"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498287 7689 flags.go:64] FLAG: --reserved-cpus=""
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498293 7689 flags.go:64] FLAG: --reserved-memory=""
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498302 7689 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498308 7689 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498315 7689 flags.go:64] FLAG: --rotate-certificates="false"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498321 7689 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498326 7689 flags.go:64] FLAG: --runonce="false"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498332 7689 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498338 7689 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498344 7689 flags.go:64] FLAG: --seccomp-default="false"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498354 7689 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498359 7689 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498365 7689 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498371 7689 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498377 7689 flags.go:64] FLAG: --storage-driver-password="root"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498383 7689 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498388 7689 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498394 7689 flags.go:64] FLAG: --storage-driver-user="root"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498400 7689 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498406 7689 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498412 7689 flags.go:64] FLAG: --system-cgroups=""
Mar 07 21:14:26.505104 master-0 kubenswrapper[7689]: I0307 21:14:26.498417 7689 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498427 7689 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498432 7689 flags.go:64] FLAG: --tls-cert-file=""
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498438 7689 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498446 7689 flags.go:64] FLAG: --tls-min-version=""
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498452 7689 flags.go:64] FLAG: --tls-private-key-file=""
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498458 7689 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498463 7689 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498469 7689 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498475 7689 flags.go:64] FLAG: --v="2"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498484 7689 flags.go:64] FLAG: --version="false"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498492 7689 flags.go:64] FLAG: --vmodule=""
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498500 7689 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: I0307 21:14:26.498507 7689 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498666 7689 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498689 7689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498695 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498700 7689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498706 7689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498712 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498717 7689 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498725 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498729 7689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:14:26.506089 master-0 kubenswrapper[7689]: W0307 21:14:26.498735 7689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498740 7689 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498745 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498749 7689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498754 7689 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498758 7689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498763 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498768 7689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498773 7689 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498777 7689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498782 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498787 7689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498791 7689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498795 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498800 7689 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498804 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498809 7689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498813 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498818 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498822 7689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:14:26.507090 master-0 kubenswrapper[7689]: W0307 21:14:26.498828 7689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498834 7689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498840 7689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498844 7689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498850 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498855 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498859 7689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498864 7689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498870 7689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498876 7689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498889 7689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498894 7689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498901 7689 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498907 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498912 7689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498919 7689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498924 7689 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498929 7689 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498934 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:14:26.507877 master-0 kubenswrapper[7689]: W0307 21:14:26.498939 7689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498944 7689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498949 7689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498953 7689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498958 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498963 7689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498967 7689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498972 7689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498978 7689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498982 7689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498987 7689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498992 7689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.498998 7689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499004 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499010 7689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499015 7689 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499021 7689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499027 7689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499031 7689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499037 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:14:26.508608 master-0 kubenswrapper[7689]: W0307 21:14:26.499042 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.499046 7689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.499051 7689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.499058 7689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: I0307 21:14:26.499069 7689 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: I0307 21:14:26.508037 7689 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: I0307 21:14:26.508098 7689 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508220 7689 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508229 7689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508235 7689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508242 7689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508248 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508254 7689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508259 7689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508265 7689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:14:26.509741 master-0 kubenswrapper[7689]: W0307 21:14:26.508270 7689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508275 7689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508283 7689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508295 7689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508309 7689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508317 7689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508324 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508330 7689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508336 7689 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508342 7689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508348 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508355 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508361 7689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508368 7689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508374 7689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508380 7689 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508386 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508392 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508398 7689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:14:26.510364 master-0 kubenswrapper[7689]: W0307 21:14:26.508405 7689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508414 7689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508427 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508435 7689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508442 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508448 7689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508455 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508465 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508471 7689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508477 7689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508483 7689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508490 7689 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508497 7689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508503 7689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508509 7689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508516 7689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508523 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508530 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508537 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:14:26.511215 master-0 kubenswrapper[7689]: W0307 21:14:26.508546 7689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508554 7689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508561 7689 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508568 7689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508574 7689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508581 7689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508588 7689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508593 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508600 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508606 7689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508614 7689 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508621 7689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508630 7689 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508637 7689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508645 7689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508653 7689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508659 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508668 7689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508707 7689 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508717 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:14:26.512057 master-0 kubenswrapper[7689]: W0307 21:14:26.508724 7689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.508730 7689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.508738 7689 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.508745 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.508754 7689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.508762 7689 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: I0307 21:14:26.508774 7689 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.508986 7689 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.508999 7689 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.509008 7689 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.509016 7689 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.509022 7689 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.509029 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.509036 7689 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.509042 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:14:26.512951 master-0 kubenswrapper[7689]: W0307 21:14:26.509049 7689 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509056 7689 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509063 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509070 7689 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509076 7689 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509083 7689 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509089 7689 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509096 7689 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509102 7689 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509109 7689 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509118 7689 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509125 7689 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509135 7689 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509143 7689 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509150 7689 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509157 7689 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509166 7689 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509174 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509182 7689 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509189 7689 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:14:26.514316 master-0 kubenswrapper[7689]: W0307 21:14:26.509197 7689 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509203 7689 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509212 7689 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509221 7689 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509228 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509237 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509244 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509251 7689 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509258 7689 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509265 7689 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509272 7689 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509279 7689 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509286 7689 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509292 7689 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509300 7689 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509306 7689 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509314 7689 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509320 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509327 7689 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509334 7689 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:14:26.514848 master-0 kubenswrapper[7689]: W0307 21:14:26.509340 7689 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509347 7689 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509355 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509363 7689 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509372 7689 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509381 7689 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509390 7689 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509398 7689 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509406 7689 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509411 7689 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509418 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509424 7689 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509430 7689 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509436 7689 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509515 7689 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509522 7689 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509528 7689 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509534 7689 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509540 7689 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:14:26.515570 master-0 kubenswrapper[7689]: W0307 21:14:26.509545 7689 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: W0307 21:14:26.509551 7689 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: W0307 21:14:26.509558 7689 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: W0307 21:14:26.509564 7689 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: W0307 21:14:26.509570 7689 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.509579 7689 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.509855 7689 server.go:940] "Client rotation is on, will bootstrap in background"
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.512166 7689 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary"
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.512275 7689 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem".
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.512593 7689 server.go:997] "Starting client certificate rotation"
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.512607 7689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.512790 7689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-08 21:04:42 +0000 UTC, rotation deadline is 2026-03-08 17:34:46.761757894 +0000 UTC
Mar 07 21:14:26.516125 master-0 kubenswrapper[7689]: I0307 21:14:26.512940 7689 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 20h20m20.248820333s for next certificate rotation
Mar 07 21:14:26.516766 master-0 kubenswrapper[7689]: I0307 21:14:26.513760 7689 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 07 21:14:26.516766 master-0 kubenswrapper[7689]: I0307 21:14:26.516722 7689 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 07 21:14:26.520069 master-0 kubenswrapper[7689]: I0307 21:14:26.520035 7689 log.go:25] "Validated CRI v1 runtime API"
Mar 07 21:14:26.522067 master-0 kubenswrapper[7689]: I0307 21:14:26.522036 7689 log.go:25] "Validated CRI v1 image API"
Mar 07 21:14:26.523041 master-0 kubenswrapper[7689]: I0307 21:14:26.523010 7689 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 07 21:14:26.526126 master-0 kubenswrapper[7689]: I0307 21:14:26.526089 7689 fs.go:135] Filesystem UUIDs: map[424f727a-1c86-4a89-859c-7d0acaca7766:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 07 21:14:26.526353 master-0 kubenswrapper[7689]: I0307 21:14:26.526114 7689 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/41f511d18c601df3347c4a0ec791b96cc20cc186c323e16593ccc895c8986828/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/41f511d18c601df3347c4a0ec791b96cc20cc186c323e16593ccc895c8986828/userdata/shm major:0 minor:244 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838/userdata/shm major:0 minor:58 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/831064ad19912357852f314d15373db7b732cf6bf4313483f541003aef1dbf06/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/831064ad19912357852f314d15373db7b732cf6bf4313483f541003aef1dbf06/userdata/shm major:0 minor:276 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601/userdata/shm major:0 minor:238 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8ec6f338d22c639a620f442f8c4c1b118ba292e32b27dd86d02fc64df14c1372/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8ec6f338d22c639a620f442f8c4c1b118ba292e32b27dd86d02fc64df14c1372/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7/userdata/shm major:0 minor:242 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8/userdata/shm major:0 minor:50 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff/userdata/shm major:0 minor:44 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767/userdata/shm major:0 minor:148 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa4738248c68a5f24174bfee8718356f164d810da170930b0082bb8c35862648/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa4738248c68a5f24174bfee8718356f164d810da170930b0082bb8c35862648/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bf99662680409a7aa806c014bae5b66c40427c61c312090f66a2311a2f39a24c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bf99662680409a7aa806c014bae5b66c40427c61c312090f66a2311a2f39a24c/userdata/shm major:0 minor:100 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960/userdata/shm major:0 minor:255 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d/userdata/shm major:0 minor:248 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e946a5469a45f458f7f3463d40633d8f93666f0c7a05ec65f3cba4034066232a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e946a5469a45f458f7f3463d40633d8f93666f0c7a05ec65f3cba4034066232a/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f5c25a913e1399497bfa0861960ba8c967f50ac5018f8a2d5b9a421ec9681e9c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f5c25a913e1399497bfa0861960ba8c967f50ac5018f8a2d5b9a421ec9681e9c/userdata/shm major:0 minor:280 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~projected/kube-api-access-zb5zm:{mountpoint:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~projected/kube-api-access-zb5zm major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~projected/kube-api-access-f72ps:{mountpoint:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~projected/kube-api-access-f72ps major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~secret/webhook-cert major:0 minor:140 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~projected/kube-api-access-l2w44:{mountpoint:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~projected/kube-api-access-l2w44 major:0 minor:246 
fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3caff2c1-f178-4e16-916d-27ccf178ff37/volumes/kubernetes.io~projected/kube-api-access-2j2bf:{mountpoint:/var/lib/kubelet/pods/3caff2c1-f178-4e16-916d-27ccf178ff37/volumes/kubernetes.io~projected/kube-api-access-2j2bf major:0 minor:115 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b/volumes/kubernetes.io~projected/kube-api-access major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~projected/kube-api-access major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~projected/kube-api-access-tpztb:{mountpoint:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~projected/kube-api-access-tpztb major:0 minor:127 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~projected/kube-api-access-4h4st:{mountpoint:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~projected/kube-api-access-4h4st major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/kube-api-access-f748l:{mountpoint:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/kube-api-access-f748l major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~projected/kube-api-access major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~projected/kube-api-access-6qskh:{mountpoint:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~projected/kube-api-access-6qskh major:0 minor:231 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~projected/kube-api-access-jbggb:{mountpoint:/var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~projected/kube-api-access-jbggb major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/666475e5-df4b-44ef-a2d4-39d84ab91aad/volumes/kubernetes.io~projected/kube-api-access-w94dz:{mountpoint:/var/lib/kubelet/pods/666475e5-df4b-44ef-a2d4-39d84ab91aad/volumes/kubernetes.io~projected/kube-api-access-w94dz major:0 minor:267 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~projected/kube-api-access-lng9v:{mountpoint:/var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~projected/kube-api-access-lng9v major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~projected/kube-api-access-6f9rq:{mountpoint:/var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~projected/kube-api-access-6f9rq major:0 minor:225 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~projected/kube-api-access-wvpvs:{mountpoint:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~projected/kube-api-access-wvpvs major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~projected/kube-api-access-9rkvj:{mountpoint:/var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~projected/kube-api-access-9rkvj major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~projected/kube-api-access-rxkw8:{mountpoint:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~projected/kube-api-access-rxkw8 major:0 minor:265 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~projected/kube-api-access-zbz9p:{mountpoint:/var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~projected/kube-api-access-zbz9p major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab2f6566-730d-46f5-92ed-79e3039d24e8/volumes/kubernetes.io~projected/kube-api-access-vjbmk:{mountpoint:/var/lib/kubelet/pods/ab2f6566-730d-46f5-92ed-79e3039d24e8/volumes/kubernetes.io~projected/kube-api-access-vjbmk major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~projected/kube-api-access major:0 minor:266 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b269ae2f-44ff-46c7-9039-21fca4a7a790/volumes/kubernetes.io~projected/kube-api-access-hx8ck:{mountpoint:/var/lib/kubelet/pods/b269ae2f-44ff-46c7-9039-21fca4a7a790/volumes/kubernetes.io~projected/kube-api-access-hx8ck major:0 minor:99 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~projected/kube-api-access-kqwrr:{mountpoint:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~projected/kube-api-access-kqwrr major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~projected/kube-api-access-p2tvr:{mountpoint:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~projected/kube-api-access-p2tvr major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~projected/kube-api-access-dgwj6:{mountpoint:/var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~projected/kube-api-access-dgwj6 major:0 minor:123 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:232 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/kube-api-access-wjtgs:{mountpoint:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/kube-api-access-wjtgs major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~projected/kube-api-access-dsspm:{mountpoint:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~projected/kube-api-access-dsspm major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~projected/kube-api-access-65pgv:{mountpoint:/var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~projected/kube-api-access-65pgv major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~projected/kube-api-access-kjhvg:{mountpoint:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~projected/kube-api-access-kjhvg major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~projected/kube-api-access-t24zr:{mountpoint:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~projected/kube-api-access-t24zr major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~projected/kube-api-access-c76ff:{mountpoint:/var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~projected/kube-api-access-c76ff major:0 minor:227 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~projected/kube-api-access-gnnlw:{mountpoint:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~projected/kube-api-access-gnnlw major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/102e4f5d0feaf57a9b6984baf9484000d8cd15c04f217b66325dda777197b743/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/7533d1363ce56a324cb4a67a5d319d1d4669201bf1675e6db220fb965a9f44b7/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-108:{mountpoint:/var/lib/containers/storage/overlay/63e9a2ea6b7852aa23430261dd3c454f42bb1a11fea9e6c597e0fff729fdb56e/merged major:0 minor:108 fsType:overlay blockSize:0} overlay_0-112:{mountpoint:/var/lib/containers/storage/overlay/23a93820ca1690317702d4e6c70506039819ca9d2ddf2cbaeba949ff0bb862c1/merged major:0 minor:112 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/489bfcd3569f9d8187ecc95eb7f291d5ac20dc52a2bee189664b83584ba05317/merged major:0 minor:121 fsType:overlay 
blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/cdf5726ac9bbc23ba841d56213cc28b0a3c6c73db9f753757f2ddf2507aa3d0c/merged major:0 minor:131 fsType:overlay blockSize:0} overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/a6072901ca010a047d748840c2b3572d1d16bfcaae0373f1d1ac3ad7c0035e19/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/71c9736e4bc5ca9f88dad6c00409fd6cb150868a4bfe067bc45fbf88f9ae00b5/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/d42fc86a22fc4fb80db19f70f2a52fcdc596dc983441a58a418ef7c844ea8a02/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/e75fb4d94971874a2a28fbdc327c9ee9baf30584f0c8a37325f6069e861b5d7b/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/fa93ee114fa8cc99ab6c119235c7edc7592d9999956606ce78922f44d8212c17/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-154:{mountpoint:/var/lib/containers/storage/overlay/d8addd6e85750007bdcc6fdee0db2abbda33dd776975b1770d8941a0bb1792de/merged major:0 minor:154 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/6df502a15cf1f57796b1cc9bc6c2216153be29b6fa6456d93e4ffbf60c17d21a/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/424f3fc04f26f7ea36e8b8c6eb8150f93dc0e8cbe7c2c80ed2d46d0fb263f3e0/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/ed464e70c1c924f1e7f418decf4391de37859617ea5e2d14c677ffce3ce0f7f6/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/6d1e48ace8503297b6ccd188d9d7eea008cdab3061a399302ba39c4a45019b6b/merged major:0 minor:168 fsType:overlay blockSize:0} 
overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/9edd7bb7bcccf00dcbef38547e8467759a006a6d690ea56e91f434ad672f81f4/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/42358ee0c1f2587e1ed52ec295e4ee8d5455f1204b421f55778e08d8b811eec6/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/0b56162755b431c428f9e23caa8457b3bf48ddd32c73320b80f25e30265d5d89/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/ea5babb320eef546678135f8a74f981b4f4840c715901e430f4cf080676bc6ba/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/2de607e7a5c24db1e8008d42d38fe99b20f7d92932313596b520cdbacd1f7571/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/ed045d65e49d7809738b16be32503d6faf143c67e03339a480253d4a66dbde84/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-270:{mountpoint:/var/lib/containers/storage/overlay/b290cf55ff09c4011fffa798f55fbe574b217cc18a5ab109a63dcebad4af856c/merged major:0 minor:270 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/1c812f33eb9e0c2a0660f674ec01ba46cdd4ed2df9fc5a0ef9e75f6ba5e6638f/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/90fd51772ea8b7b709d941265f653789d20161a1a381923b7c7c5b55d906f3e8/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-282:{mountpoint:/var/lib/containers/storage/overlay/8922805a93a92aac6242ff187acac2482a88e6dd56553c0afbc79438dc725fc8/merged major:0 minor:282 fsType:overlay blockSize:0} overlay_0-284:{mountpoint:/var/lib/containers/storage/overlay/53850e03ac3f8d571cfa010b30d7c7f3ab9a1c3d91a75f45b48b15a7bde22e5c/merged major:0 minor:284 fsType:overlay blockSize:0} 
overlay_0-286:{mountpoint:/var/lib/containers/storage/overlay/de59684b638b249a113e3d5b0a0e8aaeb24e8c656a899ffeafc574a95e26d8f3/merged major:0 minor:286 fsType:overlay blockSize:0} overlay_0-288:{mountpoint:/var/lib/containers/storage/overlay/2b889bc5bb607afdd0d5186e24d2ca769fc69f4bda82ae98a23ecd79e9d704f9/merged major:0 minor:288 fsType:overlay blockSize:0} overlay_0-290:{mountpoint:/var/lib/containers/storage/overlay/fd8de545152e072078cfae3a6187ddd6e862c45f4e95517cc01b13fc8f455bfc/merged major:0 minor:290 fsType:overlay blockSize:0} overlay_0-292:{mountpoint:/var/lib/containers/storage/overlay/2edb4ba0341ad5bbfe8288f4ec3ec6cb90c4be0fee73b247556b92dd6b610774/merged major:0 minor:292 fsType:overlay blockSize:0} overlay_0-294:{mountpoint:/var/lib/containers/storage/overlay/55eecc794b0f48d3a787866a729273a0781315507f097db0da825a80f9349ab0/merged major:0 minor:294 fsType:overlay blockSize:0} overlay_0-296:{mountpoint:/var/lib/containers/storage/overlay/6f5eba7d48e9ee11fb19e6ed56ebb69cec842347c77a23212d5c77e84bd34e8a/merged major:0 minor:296 fsType:overlay blockSize:0} overlay_0-298:{mountpoint:/var/lib/containers/storage/overlay/76efd0faae8987d9626fbdac9839331c62b9455c87575dbfd8039f71ac81ff16/merged major:0 minor:298 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/bd89bf70457c0752bb085ec0c8c9e4914cb3f1ed0c242461a9211154f929b0f6/merged major:0 minor:300 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/07b3f35b86a00481ee25a8d97659f34aa0cc7552a132ba6dd4515e647adff6d6/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-45:{mountpoint:/var/lib/containers/storage/overlay/758caadcaa1cfa8e74286612cdd57a6b939480aafc06d6e35dd5fd2f5208bbb4/merged major:0 minor:45 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/0ebfffe5350972289d1771df2487dd0f590848f7ef66c5a6462895eb05db8bbc/merged major:0 minor:48 fsType:overlay blockSize:0} 
overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/6e8dd14d2992565f4fb5cee65cb0ae8c08c72b22eace7fe467b55998edc4572d/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/109aee8e44bb77337a04ad611ec2fb23ebd5ba3d7c33da6649d49925374fdb9f/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-60:{mountpoint:/var/lib/containers/storage/overlay/45303f18932f1b0863a87f1060e9f6d3b509b8f2a63761ee1d37993ed6a34300/merged major:0 minor:60 fsType:overlay blockSize:0} overlay_0-62:{mountpoint:/var/lib/containers/storage/overlay/f50ca587d4e42bcc28290c46930dff0b10f04209d0642674e91378d68909473b/merged major:0 minor:62 fsType:overlay blockSize:0} overlay_0-64:{mountpoint:/var/lib/containers/storage/overlay/2b41228ddb93abc2c62a91097bbd5d9bf2ea90b17c39c77d1439bc23df8028cc/merged major:0 minor:64 fsType:overlay blockSize:0} overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/fecac9d5b9a3a2c68b5e3ca9c95017a6c45bd41e7573c0d475cd70e779d4bbc9/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-74:{mountpoint:/var/lib/containers/storage/overlay/3b512a2fada9bb72a9a751d199a9bdb86c68a83adf918f5a82fb46bf1e5a7128/merged major:0 minor:74 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/16bde1e67c84c81b1adba8440d6dc1e2ad6a1eef154b9dab3d2b8f582f71d6fa/merged major:0 minor:76 fsType:overlay blockSize:0} overlay_0-84:{mountpoint:/var/lib/containers/storage/overlay/8243b6fc3085df2e24e19872332aa8302ff9e1a51e12c7cc4fa94c0c1fa2c674/merged major:0 minor:84 fsType:overlay blockSize:0} overlay_0-86:{mountpoint:/var/lib/containers/storage/overlay/ae5b6dd1c19d3c86ab45def93e171194e208830f00e775a62a046552de68bd72/merged major:0 minor:86 fsType:overlay blockSize:0}] Mar 07 21:14:26.589484 master-0 kubenswrapper[7689]: I0307 21:14:26.588259 7689 manager.go:217] Machine: {Timestamp:2026-03-07 21:14:26.585961384 +0000 UTC m=+0.138288346 CPUVendorID:AuthenticAMD NumCores:16 
NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514145280 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:fd388b0a7ee840b7a9a8619058f28513 SystemUUID:fd388b0a-7ee8-40b7-a9a8-619058f28513 BootID:1e0d9bad-17ce-4467-8d98-7b297ec5d412 Filesystems:[{Device:/run/containers/storage/overlay-containers/cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960/userdata/shm DeviceMajor:0 DeviceMinor:255 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-282 DeviceMajor:0 DeviceMinor:282 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/kube-api-access-wjtgs DeviceMajor:0 DeviceMinor:249 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~projected/kube-api-access-zbz9p DeviceMajor:0 DeviceMinor:247 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f5c25a913e1399497bfa0861960ba8c967f50ac5018f8a2d5b9a421ec9681e9c/userdata/shm DeviceMajor:0 DeviceMinor:280 Capacity:67108864 Type:vfs Inodes:6166277 
HasInodes:true} {Device:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~projected/kube-api-access-f72ps DeviceMajor:0 DeviceMinor:141 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/41f511d18c601df3347c4a0ec791b96cc20cc186c323e16593ccc895c8986828/userdata/shm DeviceMajor:0 DeviceMinor:244 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-45 DeviceMajor:0 DeviceMinor:45 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da/userdata/shm DeviceMajor:0 DeviceMinor:104 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-138 DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~projected/kube-api-access-c76ff DeviceMajor:0 DeviceMinor:227 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/831064ad19912357852f314d15373db7b732cf6bf4313483f541003aef1dbf06/userdata/shm DeviceMajor:0 DeviceMinor:276 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~projected/kube-api-access-wvpvs DeviceMajor:0 DeviceMinor:229 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~projected/kube-api-access-lng9v DeviceMajor:0 DeviceMinor:241 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~projected/kube-api-access-l2w44 DeviceMajor:0 DeviceMinor:246 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:226 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~projected/kube-api-access-65pgv DeviceMajor:0 DeviceMinor:228 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-84 DeviceMajor:0 DeviceMinor:84 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:237 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-270 DeviceMajor:0 DeviceMinor:270 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8ec6f338d22c639a620f442f8c4c1b118ba292e32b27dd86d02fc64df14c1372/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-112 DeviceMajor:0 DeviceMinor:112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~projected/kube-api-access-dgwj6 DeviceMajor:0 DeviceMinor:123 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~projected/kube-api-access-6f9rq DeviceMajor:0 DeviceMinor:225 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~projected/kube-api-access-dsspm DeviceMajor:0 DeviceMinor:234 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838/userdata/shm DeviceMajor:0 DeviceMinor:58 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-74 DeviceMajor:0 DeviceMinor:74 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/666475e5-df4b-44ef-a2d4-39d84ab91aad/volumes/kubernetes.io~projected/kube-api-access-w94dz DeviceMajor:0 DeviceMinor:267 
Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~projected/kube-api-access-4h4st DeviceMajor:0 DeviceMinor:125 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~projected/kube-api-access-tpztb DeviceMajor:0 DeviceMinor:127 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-62 DeviceMajor:0 DeviceMinor:62 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-108 DeviceMajor:0 DeviceMinor:108 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~projected/kube-api-access-zb5zm DeviceMajor:0 DeviceMinor:235 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:266 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~projected/kube-api-access-kqwrr DeviceMajor:0 DeviceMinor:240 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~projected/kube-api-access-t24zr DeviceMajor:0 DeviceMinor:223 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~projected/kube-api-access-rxkw8 DeviceMajor:0 DeviceMinor:265 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/var/lib/kubelet/pods/b269ae2f-44ff-46c7-9039-21fca4a7a790/volumes/kubernetes.io~projected/kube-api-access-hx8ck DeviceMajor:0 DeviceMinor:99 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e946a5469a45f458f7f3463d40633d8f93666f0c7a05ec65f3cba4034066232a/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-298 DeviceMajor:0 DeviceMinor:298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bf99662680409a7aa806c014bae5b66c40427c61c312090f66a2311a2f39a24c/userdata/shm DeviceMajor:0 DeviceMinor:100 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0/userdata/shm DeviceMajor:0 
DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-290 DeviceMajor:0 DeviceMinor:290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-60 DeviceMajor:0 DeviceMinor:60 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~projected/kube-api-access-p2tvr DeviceMajor:0 DeviceMinor:233 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d/userdata/shm DeviceMajor:0 DeviceMinor:248 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa4738248c68a5f24174bfee8718356f164d810da170930b0082bb8c35862648/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-296 DeviceMajor:0 DeviceMinor:296 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-86 DeviceMajor:0 DeviceMinor:86 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3caff2c1-f178-4e16-916d-27ccf178ff37/volumes/kubernetes.io~projected/kube-api-access-2j2bf DeviceMajor:0 DeviceMinor:115 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-131 DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-288 DeviceMajor:0 DeviceMinor:288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/kube-api-access-f748l DeviceMajor:0 DeviceMinor:224 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767/userdata/shm DeviceMajor:0 DeviceMinor:148 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:216 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs 
Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~projected/kube-api-access-kjhvg DeviceMajor:0 DeviceMinor:94 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-154 DeviceMajor:0 DeviceMinor:154 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-294 DeviceMajor:0 DeviceMinor:294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~projected/kube-api-access-9rkvj DeviceMajor:0 DeviceMinor:254 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-286 DeviceMajor:0 DeviceMinor:286 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:232 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601/userdata/shm DeviceMajor:0 DeviceMinor:238 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~projected/kube-api-access 
DeviceMajor:0 DeviceMinor:236 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-292 DeviceMajor:0 DeviceMinor:292 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:98 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-64 DeviceMajor:0 DeviceMinor:64 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:230 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~projected/kube-api-access-6qskh DeviceMajor:0 DeviceMinor:231 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~projected/kube-api-access-gnnlw DeviceMajor:0 DeviceMinor:251 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257074688 Type:vfs 
Inodes:1048576 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff/userdata/shm DeviceMajor:0 DeviceMinor:44 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:140 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257070592 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~projected/kube-api-access-jbggb DeviceMajor:0 DeviceMinor:257 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ab2f6566-730d-46f5-92ed-79e3039d24e8/volumes/kubernetes.io~projected/kube-api-access-vjbmk DeviceMajor:0 DeviceMinor:260 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-284 DeviceMajor:0 DeviceMinor:284 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7/userdata/shm DeviceMajor:0 DeviceMinor:242 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8/userdata/shm DeviceMajor:0 DeviceMinor:50 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0855fa1274661b8 MacAddress:aa:f3:26:91:81:33 Speed:10000 Mtu:1350} {Name:08b2cad01a6764d MacAddress:e2:a6:ee:16:32:b3 Speed:10000 Mtu:1350} {Name:41f511d18c601df MacAddress:36:aa:cf:78:09:d0 Speed:10000 Mtu:1350} {Name:831064ad1991235 MacAddress:26:4c:e2:ea:57:99 Speed:10000 Mtu:1350} {Name:8b09916c2044187 MacAddress:92:93:f6:b3:67:43 Speed:10000 Mtu:1350} {Name:8ec6f338d22c639 MacAddress:b6:5e:82:b8:70:be Speed:10000 Mtu:1350} {Name:90d94cc33aea936 MacAddress:aa:18:2c:a8:b1:67 Speed:10000 Mtu:1350} {Name:aa4738248c68a5f MacAddress:42:b6:8a:0d:ee:bf Speed:10000 Mtu:1350} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:1450} {Name:br-int MacAddress:aa:bb:22:b4:42:aa Speed:0 Mtu:1350} {Name:cd1527a85e67a94 MacAddress:1e:12:06:a3:b7:cb Speed:10000 Mtu:1350} {Name:d8f5f93a07e9343 MacAddress:52:d1:a0:fc:aa:8d Speed:10000 Mtu:1350} {Name:e946a5469a45f45 MacAddress:0e:4d:d7:9c:44:d1 Speed:10000 Mtu:1350} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 
Speed:-1 Mtu:1450} {Name:eth1 MacAddress:fa:16:3e:6e:e9:7d Speed:-1 Mtu:1450} {Name:eth2 MacAddress:fa:16:3e:38:b1:02 Speed:-1 Mtu:1450} {Name:f8d1302e8231065 MacAddress:22:6a:0f:2d:3e:20 Speed:10000 Mtu:1350} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:1350} {Name:ovs-system MacAddress:e6:01:79:57:35:2d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514145280 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 
Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] 
Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Mar 07 21:14:26.589484 master-0 kubenswrapper[7689]: I0307 21:14:26.589453 7689 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Mar 07 21:14:26.590142 master-0 kubenswrapper[7689]: I0307 21:14:26.589774 7689 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Mar 07 21:14:26.590432 master-0 kubenswrapper[7689]: I0307 21:14:26.590374 7689 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority"
Mar 07 21:14:26.590651 master-0 kubenswrapper[7689]: I0307 21:14:26.590581 7689 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Mar 07 21:14:26.590932 master-0 kubenswrapper[7689]: I0307 21:14:26.590633 7689 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Mar 07 21:14:26.591003 master-0 kubenswrapper[7689]: I0307 21:14:26.590956 7689 topology_manager.go:138] "Creating topology manager with none policy"
Mar 07 21:14:26.591003 master-0 kubenswrapper[7689]: I0307 21:14:26.590973 7689 container_manager_linux.go:303] "Creating device plugin manager"
Mar 07 21:14:26.591003 master-0 kubenswrapper[7689]: I0307 21:14:26.590986 7689 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 07 21:14:26.591111 master-0 kubenswrapper[7689]: I0307 21:14:26.591018 7689 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Mar 07 21:14:26.591417 master-0 kubenswrapper[7689]: I0307 21:14:26.591381 7689 state_mem.go:36] "Initialized new in-memory state store"
Mar 07 21:14:26.591556 master-0 kubenswrapper[7689]: I0307 21:14:26.591506 7689 server.go:1245] "Using root directory" path="/var/lib/kubelet"
Mar 07 21:14:26.591601 master-0 kubenswrapper[7689]: I0307 21:14:26.591586 7689 kubelet.go:418] "Attempting to sync node with API server"
Mar 07 21:14:26.591644 master-0 kubenswrapper[7689]: I0307 21:14:26.591602 7689 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests"
Mar 07 21:14:26.591644 master-0 kubenswrapper[7689]: I0307 21:14:26.591623 7689 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Mar 07 21:14:26.591644 master-0 kubenswrapper[7689]: I0307 21:14:26.591637 7689 kubelet.go:324] "Adding apiserver pod source"
Mar 07 21:14:26.591774 master-0 kubenswrapper[7689]: I0307 21:14:26.591659 7689 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Mar 07 21:14:26.593666 master-0 kubenswrapper[7689]: I0307 21:14:26.593622 7689 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1"
Mar 07 21:14:26.593875 master-0 kubenswrapper[7689]: I0307 21:14:26.593845 7689 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem".
Mar 07 21:14:26.594181 master-0 kubenswrapper[7689]: I0307 21:14:26.594148 7689 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode"
Mar 07 21:14:26.594329 master-0 kubenswrapper[7689]: I0307 21:14:26.594276 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Mar 07 21:14:26.594329 master-0 kubenswrapper[7689]: I0307 21:14:26.594300 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Mar 07 21:14:26.594329 master-0 kubenswrapper[7689]: I0307 21:14:26.594308 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Mar 07 21:14:26.594329 master-0 kubenswrapper[7689]: I0307 21:14:26.594315 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Mar 07 21:14:26.594329 master-0 kubenswrapper[7689]: I0307 21:14:26.594322 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Mar 07 21:14:26.594329 master-0 kubenswrapper[7689]: I0307 21:14:26.594331 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Mar 07 21:14:26.594329 master-0 kubenswrapper[7689]: I0307 21:14:26.594339 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Mar 07 21:14:26.594571 master-0 kubenswrapper[7689]: I0307 21:14:26.594348 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Mar 07 21:14:26.594571 master-0 kubenswrapper[7689]: I0307 21:14:26.594358 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Mar 07 21:14:26.594571 master-0 kubenswrapper[7689]: I0307 21:14:26.594366 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Mar 07 21:14:26.594571 master-0 kubenswrapper[7689]: I0307 21:14:26.594377 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Mar 07 21:14:26.594571 master-0 kubenswrapper[7689]: I0307 21:14:26.594391 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Mar 07 21:14:26.594571 master-0 kubenswrapper[7689]: I0307 21:14:26.594419 7689 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Mar 07 21:14:26.594798 master-0 kubenswrapper[7689]: I0307 21:14:26.594772 7689 server.go:1280] "Started kubelet"
Mar 07 21:14:26.596165 master-0 systemd[1]: Started Kubernetes Kubelet.
Mar 07 21:14:26.599564 master-0 kubenswrapper[7689]: I0307 21:14:26.596600 7689 server.go:163] "Starting to listen" address="0.0.0.0" port=10250
Mar 07 21:14:26.599564 master-0 kubenswrapper[7689]: I0307 21:14:26.597493 7689 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Mar 07 21:14:26.599564 master-0 kubenswrapper[7689]: I0307 21:14:26.597767 7689 server_v1.go:47] "podresources" method="list" useActivePods=true
Mar 07 21:14:26.604640 master-0 kubenswrapper[7689]: I0307 21:14:26.598477 7689 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Mar 07 21:14:26.615202 master-0 kubenswrapper[7689]: I0307 21:14:26.615129 7689 server.go:449] "Adding debug handlers to kubelet server"
Mar 07 21:14:26.619951 master-0 kubenswrapper[7689]: I0307 21:14:26.619900 7689 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 07 21:14:26.620020 master-0 kubenswrapper[7689]: I0307 21:14:26.619941 7689 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160
Mar 07 21:14:26.624453 master-0 kubenswrapper[7689]: I0307 21:14:26.624362 7689 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled
Mar 07 21:14:26.624453 master-0 kubenswrapper[7689]: I0307 21:14:26.624438 7689 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Mar 07 21:14:26.624782 master-0 kubenswrapper[7689]: I0307 21:14:26.624512 7689 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-08 21:04:42 +0000 UTC, rotation deadline is 2026-03-08 18:26:59.055757528 +0000 UTC
Mar 07 21:14:26.624782 master-0 kubenswrapper[7689]: I0307 21:14:26.624592 7689 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 21h12m32.431168558s for next certificate rotation
Mar 07 21:14:26.624782 master-0 kubenswrapper[7689]: I0307 21:14:26.624613 7689 volume_manager.go:287] "The desired_state_of_world populator starts"
Mar 07 21:14:26.624782 master-0 kubenswrapper[7689]: I0307 21:14:26.624645 7689 volume_manager.go:289] "Starting Kubelet Volume Manager"
Mar 07 21:14:26.624945 master-0 kubenswrapper[7689]: I0307 21:14:26.624796 7689 desired_state_of_world_populator.go:147] "Desired state populator starts to run"
Mar 07 21:14:26.625643 master-0 kubenswrapper[7689]: I0307 21:14:26.625601 7689 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Mar 07 21:14:26.625643 master-0 kubenswrapper[7689]: I0307 21:14:26.625629 7689 factory.go:55] Registering systemd factory
Mar 07 21:14:26.625643 master-0 kubenswrapper[7689]: I0307 21:14:26.625639 7689 factory.go:221] Registration of the systemd container factory successfully
Mar 07 21:14:26.626595 master-0 kubenswrapper[7689]: I0307 21:14:26.626513 7689 factory.go:153] Registering CRI-O factory
Mar 07 21:14:26.626595 master-0 kubenswrapper[7689]: I0307 21:14:26.626563 7689 factory.go:221] Registration of the crio container factory successfully
Mar 07 21:14:26.626595 master-0 kubenswrapper[7689]: I0307 21:14:26.626601 7689 factory.go:103] Registering Raw factory
Mar 07 21:14:26.627596 master-0 kubenswrapper[7689]: I0307 21:14:26.626618 7689 manager.go:1196] Started watching for new ooms in manager
Mar 07 21:14:26.627596 master-0 kubenswrapper[7689]: I0307 21:14:26.627273 7689 manager.go:319] Starting recovery of all containers
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634154 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8980370-267c-4168-ba97-d780698533ff" volumeName="kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634260 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc392945-53ad-473c-8803-70e2026712d2" volumeName="kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634285 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" volumeName="kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634304 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666475e5-df4b-44ef-a2d4-39d84ab91aad" volumeName="kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634329 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8269652e-360f-43ef-9e7d-473c5f478275" volumeName="kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634347 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" volumeName="kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634363 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d64cd1-bd5b-4fbc-972b-000a03c854fe" volumeName="kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634381 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b88c5fbe-e19f-45b3-ab03-e1626f95776d" volumeName="kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634405 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634433 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634453 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634480 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47ecf172-666e-4360-97ff-bd9dbccc1fd6" volumeName="kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634501 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8269652e-360f-43ef-9e7d-473c5f478275" volumeName="kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634544 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634569 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634602 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="abfb5602-7255-43d7-a510-e7f94885887e" volumeName="kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634624 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634645 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29624e4f-d970-4dfa-a8f1-515b73397c8f" volumeName="kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634672 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634720 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634750 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e543d99f-e0dc-49be-95bd-c39eabd05ce8" volumeName="kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634765 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8980370-267c-4168-ba97-d780698533ff" volumeName="kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634783 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29624e4f-d970-4dfa-a8f1-515b73397c8f" volumeName="kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634807 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" volumeName="kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634825 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3faedef9-d507-48aa-82a8-f3dc9b5adeef" volumeName="kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634849 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634871 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634895 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b269ae2f-44ff-46c7-9039-21fca4a7a790" volumeName="kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634913 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b88c5fbe-e19f-45b3-ab03-e1626f95776d" volumeName="kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634933 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd633b72-3d0b-4601-a2c2-3f487d943b35" volumeName="kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634950 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f69689-ff12-4786-af05-61429e9eadf8" volumeName="kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634973 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.634988 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635003 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635021 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e543d99f-e0dc-49be-95bd-c39eabd05ce8" volumeName="kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635035 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e543d99f-e0dc-49be-95bd-c39eabd05ce8" volumeName="kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635059 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635080 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635102 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29624e4f-d970-4dfa-a8f1-515b73397c8f" volumeName="kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635181 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635255 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" volumeName="kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635271 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d64cd1-bd5b-4fbc-972b-000a03c854fe" volumeName="kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635299 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" volumeName="kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635318 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635340 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666475e5-df4b-44ef-a2d4-39d84ab91aad" volumeName="kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635355 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635372 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b269ae2f-44ff-46c7-9039-21fca4a7a790" volumeName="kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635394 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47ecf172-666e-4360-97ff-bd9dbccc1fd6" volumeName="kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635418 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b339e6a-cae6-416a-963b-2fd23cecba96" volumeName="kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635440 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" volumeName="kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635455 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c93e0d-54e5-4c80-9d69-a70317baeacf" volumeName="kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635471 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635504 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47ecf172-666e-4360-97ff-bd9dbccc1fd6" volumeName="kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635531 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635549 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635571 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61a9fce6-50e1-413c-9ec0-177d6e903bdd" volumeName="kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635592 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="982319eb-2dc2-4faa-85d8-ee11840179fd" volumeName="kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635615 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f69689-ff12-4786-af05-61429e9eadf8" volumeName="kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635640 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f69689-ff12-4786-af05-61429e9eadf8" volumeName="kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm" seLinuxMountContext=""
Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635659 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635676 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635732 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3faedef9-d507-48aa-82a8-f3dc9b5adeef" volumeName="kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635753 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b339e6a-cae6-416a-963b-2fd23cecba96" volumeName="kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635828 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b339e6a-cae6-416a-963b-2fd23cecba96" volumeName="kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635874 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd633b72-3d0b-4601-a2c2-3f487d943b35" volumeName="kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635943 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="abfb5602-7255-43d7-a510-e7f94885887e" volumeName="kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635964 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b269ae2f-44ff-46c7-9039-21fca4a7a790" volumeName="kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.635982 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b88c5fbe-e19f-45b3-ab03-e1626f95776d" volumeName="kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636003 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636018 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636046 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636063 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="abfb5602-7255-43d7-a510-e7f94885887e" volumeName="kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636080 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc392945-53ad-473c-8803-70e2026712d2" volumeName="kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636104 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636121 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636144 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636165 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd633b72-3d0b-4601-a2c2-3f487d943b35" volumeName="kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636181 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636208 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3faedef9-d507-48aa-82a8-f3dc9b5adeef" volumeName="kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636230 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69851821-e1fc-44a8-98df-0cfe9d564126" volumeName="kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636257 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" volumeName="kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636279 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e720291b-0f96-4ebb-80f2-5df7cb194ffc" volumeName="kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636298 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636329 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="8269652e-360f-43ef-9e7d-473c5f478275" volumeName="kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636346 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab2f6566-730d-46f5-92ed-79e3039d24e8" volumeName="kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636368 7689 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c93e0d-54e5-4c80-9d69-a70317baeacf" volumeName="kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr" seLinuxMountContext="" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636383 7689 reconstruct.go:97] "Volume reconstruction finished" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636393 7689 reconciler.go:26] "Reconciler: start to sync state" Mar 07 21:14:26.637818 master-0 kubenswrapper[7689]: I0307 21:14:26.636767 7689 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 21:14:26.680724 master-0 kubenswrapper[7689]: I0307 21:14:26.679505 7689 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 07 21:14:26.682402 master-0 kubenswrapper[7689]: I0307 21:14:26.682367 7689 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 07 21:14:26.682506 master-0 kubenswrapper[7689]: I0307 21:14:26.682437 7689 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 07 21:14:26.682506 master-0 kubenswrapper[7689]: I0307 21:14:26.682474 7689 kubelet.go:2335] "Starting kubelet main sync loop" Mar 07 21:14:26.682603 master-0 kubenswrapper[7689]: E0307 21:14:26.682551 7689 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 07 21:14:26.685572 master-0 kubenswrapper[7689]: I0307 21:14:26.685540 7689 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 07 21:14:26.689805 master-0 kubenswrapper[7689]: I0307 21:14:26.688783 7689 generic.go:334] "Generic (PLEG): container finished" podID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerID="5d8a696d04df358a26bc157288f94a3ff4652e100c1ed368a8504d7b4df97ebb" exitCode=0 Mar 07 21:14:26.760820 master-0 kubenswrapper[7689]: I0307 21:14:26.760719 7689 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 07 21:14:26.775195 master-0 kubenswrapper[7689]: I0307 21:14:26.774925 7689 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71" exitCode=0 Mar 07 21:14:26.779755 master-0 kubenswrapper[7689]: I0307 21:14:26.779645 7689 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="8fa7422d23bcb03f45ab2c3bec3ea5e6214caa8f28b047daca9c932d4eca1830" exitCode=0 Mar 07 21:14:26.779755 master-0 kubenswrapper[7689]: I0307 21:14:26.779701 7689 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="9574294e43be3e31b38cb24910c3f9a3961ac6d7fef3d8e88cef73fac06c22e3" exitCode=0 Mar 07 21:14:26.779755 master-0 kubenswrapper[7689]: I0307 
21:14:26.779709 7689 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="22f599619e79420fd9506a4f183f60ba821b3ac500c2322da39e388d594122e4" exitCode=0 Mar 07 21:14:26.779755 master-0 kubenswrapper[7689]: I0307 21:14:26.779718 7689 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="a27fb1b48c1a71a257bd1f0e26afd03b783b613cf862675ed38b35ffc09792a8" exitCode=0 Mar 07 21:14:26.779755 master-0 kubenswrapper[7689]: I0307 21:14:26.779725 7689 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="1204b0d0bd4ef37ca4508ca7c0bfef9f1e850dc26e2ddde2b7523df8be7455e3" exitCode=0 Mar 07 21:14:26.779755 master-0 kubenswrapper[7689]: I0307 21:14:26.779734 7689 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="7321d4d6e798cb535bde4f9b51f6814dd5e6706005dac86d4315f2c88fc7fa27" exitCode=0 Mar 07 21:14:26.782088 master-0 kubenswrapper[7689]: I0307 21:14:26.782065 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log" Mar 07 21:14:26.782658 master-0 kubenswrapper[7689]: I0307 21:14:26.782633 7689 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932" exitCode=1 Mar 07 21:14:26.782770 master-0 kubenswrapper[7689]: I0307 21:14:26.782756 7689 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="f775564de6004b1533b00fbc2fd4348436f4183f4b5381b615f45abdd8af0248" exitCode=0 Mar 07 21:14:26.782882 master-0 kubenswrapper[7689]: E0307 21:14:26.782655 7689 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet" 
Mar 07 21:14:26.798153 master-0 kubenswrapper[7689]: I0307 21:14:26.798095 7689 generic.go:334] "Generic (PLEG): container finished" podID="420c6d8f-6313-4d6c-b817-420797fc6878" containerID="89e83b02510db448aa7211c7a69aa7fdf926031ee29094a8ecb9aeeb18ccc925" exitCode=0 Mar 07 21:14:26.800354 master-0 kubenswrapper[7689]: I0307 21:14:26.800308 7689 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="14697f7165ea16496d207a527c3a0eec6d705bfe290e2065971615387572920a" exitCode=1 Mar 07 21:14:26.804448 master-0 kubenswrapper[7689]: I0307 21:14:26.804384 7689 generic.go:334] "Generic (PLEG): container finished" podID="2d827a93-49e5-4694-b119-957cfa9bd648" containerID="485cabca7a9edbb9a83d8ef9ee43891f8c296cb8958998f7a4fa97d4fc8e25c3" exitCode=0 Mar 07 21:14:26.856837 master-0 kubenswrapper[7689]: I0307 21:14:26.856720 7689 manager.go:324] Recovery completed Mar 07 21:14:26.916102 master-0 kubenswrapper[7689]: I0307 21:14:26.915947 7689 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 07 21:14:26.916102 master-0 kubenswrapper[7689]: I0307 21:14:26.916004 7689 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 07 21:14:26.916102 master-0 kubenswrapper[7689]: I0307 21:14:26.916042 7689 state_mem.go:36] "Initialized new in-memory state store" Mar 07 21:14:26.916407 master-0 kubenswrapper[7689]: I0307 21:14:26.916361 7689 state_mem.go:88] "Updated default CPUSet" cpuSet="" Mar 07 21:14:26.916463 master-0 kubenswrapper[7689]: I0307 21:14:26.916396 7689 state_mem.go:96] "Updated CPUSet assignments" assignments={} Mar 07 21:14:26.916463 master-0 kubenswrapper[7689]: I0307 21:14:26.916439 7689 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint" Mar 07 21:14:26.916463 master-0 kubenswrapper[7689]: I0307 21:14:26.916454 7689 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet="" Mar 07 21:14:26.916585 master-0 kubenswrapper[7689]: I0307 21:14:26.916469 7689 
policy_none.go:49] "None policy: Start" Mar 07 21:14:26.923891 master-0 kubenswrapper[7689]: I0307 21:14:26.923836 7689 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 07 21:14:26.923963 master-0 kubenswrapper[7689]: I0307 21:14:26.923915 7689 state_mem.go:35] "Initializing new in-memory state store" Mar 07 21:14:26.924329 master-0 kubenswrapper[7689]: I0307 21:14:26.924296 7689 state_mem.go:75] "Updated machine memory state" Mar 07 21:14:26.924329 master-0 kubenswrapper[7689]: I0307 21:14:26.924316 7689 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint" Mar 07 21:14:26.935696 master-0 kubenswrapper[7689]: I0307 21:14:26.935630 7689 manager.go:334] "Starting Device Plugin manager" Mar 07 21:14:26.937116 master-0 kubenswrapper[7689]: I0307 21:14:26.937058 7689 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 07 21:14:26.937116 master-0 kubenswrapper[7689]: I0307 21:14:26.937112 7689 server.go:79] "Starting device plugin registration server" Mar 07 21:14:26.941830 master-0 kubenswrapper[7689]: I0307 21:14:26.939921 7689 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 07 21:14:26.941886 master-0 kubenswrapper[7689]: I0307 21:14:26.941825 7689 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 07 21:14:26.942348 master-0 kubenswrapper[7689]: I0307 21:14:26.942306 7689 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 07 21:14:26.942467 master-0 kubenswrapper[7689]: I0307 21:14:26.942431 7689 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 07 21:14:26.942467 master-0 kubenswrapper[7689]: I0307 21:14:26.942455 7689 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 07 21:14:26.983068 master-0 kubenswrapper[7689]: I0307 21:14:26.983012 7689 pod_container_deletor.go:80] "Container not 
found in pod's containers" containerID="cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a" Mar 07 21:14:26.983068 master-0 kubenswrapper[7689]: I0307 21:14:26.983052 7689 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0"] Mar 07 21:14:26.983490 master-0 kubenswrapper[7689]: I0307 21:14:26.983412 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe"} Mar 07 21:14:26.983490 master-0 kubenswrapper[7689]: I0307 21:14:26.983488 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08"} Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983500 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0-master-0" event={"ID":"354f29997baa583b6238f7de9108ee10","Type":"ContainerStarted","Data":"a4d69998b458628175c09aa6eead6ce76a76afbfab0f85e583b7bc54795e93e8"} Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983512 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da"} Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983523 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" 
event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582"} Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983536 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="86843bb1afb42dfa39abd9f6396ee672969a397644d2eb40da43bc284d9135db" Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983549 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea"} Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983558 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c"} Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983568 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerDied","Data":"d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71"} Mar 07 21:14:26.983588 master-0 kubenswrapper[7689]: I0307 21:14:26.983579 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" event={"ID":"5f77c8e18b751d90bc0dfe2d4e304050","Type":"ContainerStarted","Data":"5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983602 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" 
event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a056ccba22060bdb53ac003460ab1c7bac5832040445f86cf7efe33efd5a3ab2"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983614 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983627 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"f775564de6004b1533b00fbc2fd4348436f4183f4b5381b615f45abdd8af0248"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983637 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983668 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"1720f06011ab4886e92b7c5a8e88d7c953f6ae789c60589ab28e6980a7428f51"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983694 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"280e10e4ead7199cb4e5eb06d68976c14126e54c3ec3e9d229c33b8faed6eeb7"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983705 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"14697f7165ea16496d207a527c3a0eec6d705bfe290e2065971615387572920a"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983726 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff"} Mar 07 21:14:26.984035 master-0 kubenswrapper[7689]: I0307 21:14:26.983749 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c" Mar 07 21:14:27.000194 master-0 kubenswrapper[7689]: E0307 21:14:27.000142 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:14:27.000365 master-0 kubenswrapper[7689]: E0307 21:14:27.000150 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:14:27.000472 master-0 kubenswrapper[7689]: E0307 21:14:27.000428 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 07 21:14:27.000644 master-0 kubenswrapper[7689]: E0307 21:14:27.000366 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:14:27.000782 master-0 kubenswrapper[7689]: W0307 21:14:27.000310 7689 warnings.go:70] would violate PodSecurity "restricted:latest": host namespaces (hostNetwork=true), hostPort 
(container "etcd" uses hostPorts 2379, 2380), privileged (containers "etcdctl", "etcd" must not set securityContext.privileged=true), allowPrivilegeEscalation != false (containers "etcdctl", "etcd" must set securityContext.allowPrivilegeEscalation=false), unrestricted capabilities (containers "etcdctl", "etcd" must set securityContext.capabilities.drop=["ALL"]), restricted volume types (volumes "certs", "data-dir" use restricted volume type "hostPath"), runAsNonRoot != true (pod or containers "etcdctl", "etcd" must set securityContext.runAsNonRoot=true), seccompProfile (pod or containers "etcdctl", "etcd" must set securityContext.seccompProfile.type to "RuntimeDefault" or "Localhost")
Mar 07 21:14:27.000782 master-0 kubenswrapper[7689]: E0307 21:14:27.000782 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0-master-0\" already exists" pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:14:27.043062 master-0 kubenswrapper[7689]: I0307 21:14:27.042995 7689 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:14:27.045818 master-0 kubenswrapper[7689]: I0307 21:14:27.045775 7689 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:14:27.045818 master-0 kubenswrapper[7689]: I0307 21:14:27.045823 7689 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:14:27.045946 master-0 kubenswrapper[7689]: I0307 21:14:27.045833 7689 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:14:27.045946 master-0 kubenswrapper[7689]: I0307 21:14:27.045917 7689 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 07 21:14:27.061226 master-0 kubenswrapper[7689]: I0307 21:14:27.061161 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061232 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061259 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061276 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061294 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061317 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061334 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061376 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:14:27.061402 master-0 kubenswrapper[7689]: I0307 21:14:27.061392 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.061757 master-0 kubenswrapper[7689]: I0307 21:14:27.061411 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.062256 master-0 kubenswrapper[7689]: I0307 21:14:27.062220 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.062338 master-0 kubenswrapper[7689]: I0307 21:14:27.062292 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.062338 master-0 kubenswrapper[7689]: I0307 21:14:27.062325 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:14:27.062420 master-0 kubenswrapper[7689]: I0307 21:14:27.062348 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.062420 master-0 kubenswrapper[7689]: I0307 21:14:27.062356 7689 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 07 21:14:27.062420 master-0 kubenswrapper[7689]: I0307 21:14:27.062365 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.062420 master-0 kubenswrapper[7689]: I0307 21:14:27.062385 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:14:27.062420 master-0 kubenswrapper[7689]: I0307 21:14:27.062407 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:14:27.062773 master-0 kubenswrapper[7689]: I0307 21:14:27.062735 7689 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 07 21:14:27.163666 master-0 kubenswrapper[7689]: I0307 21:14:27.163579 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:14:27.163666 master-0 kubenswrapper[7689]: I0307 21:14:27.163650 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.163666 master-0 kubenswrapper[7689]: I0307 21:14:27.163697 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.163852 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.163907 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.163979 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.163980 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.164008 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.164015 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.164042 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:14:27.164115 master-0 kubenswrapper[7689]: I0307 21:14:27.164106 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164329 master-0 kubenswrapper[7689]: I0307 21:14:27.164155 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164329 master-0 kubenswrapper[7689]: I0307 21:14:27.164231 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164329 master-0 kubenswrapper[7689]: I0307 21:14:27.164242 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164329 master-0 kubenswrapper[7689]: I0307 21:14:27.164263 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:14:27.164329 master-0 kubenswrapper[7689]: I0307 21:14:27.164330 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:14:27.164481 master-0 kubenswrapper[7689]: I0307 21:14:27.164354 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:14:27.164481 master-0 kubenswrapper[7689]: I0307 21:14:27.164400 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:14:27.164481 master-0 kubenswrapper[7689]: I0307 21:14:27.164424 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164481 master-0 kubenswrapper[7689]: I0307 21:14:27.164481 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:14:27.164600 master-0 kubenswrapper[7689]: I0307 21:14:27.164492 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:14:27.164600 master-0 kubenswrapper[7689]: I0307 21:14:27.164495 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164600 master-0 kubenswrapper[7689]: I0307 21:14:27.164516 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164600 master-0 kubenswrapper[7689]: I0307 21:14:27.164521 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:14:27.164600 master-0 kubenswrapper[7689]: I0307 21:14:27.164540 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:14:27.164600 master-0 kubenswrapper[7689]: I0307 21:14:27.164564 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"etcd-master-0-master-0\" (UID: \"354f29997baa583b6238f7de9108ee10\") " pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:14:27.164600 master-0 kubenswrapper[7689]: I0307 21:14:27.164586 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164840 master-0 kubenswrapper[7689]: I0307 21:14:27.164617 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164840 master-0 kubenswrapper[7689]: I0307 21:14:27.164618 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164840 master-0 kubenswrapper[7689]: I0307 21:14:27.164645 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164840 master-0 kubenswrapper[7689]: I0307 21:14:27.164648 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164840 master-0 kubenswrapper[7689]: I0307 21:14:27.164666 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"bootstrap-kube-apiserver-master-0\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:27.164840 master-0 kubenswrapper[7689]: I0307 21:14:27.164717 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.164840 master-0 kubenswrapper[7689]: I0307 21:14:27.164757 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:27.592829 master-0 kubenswrapper[7689]: I0307 21:14:27.592410 7689 apiserver.go:52] "Watching apiserver"
Mar 07 21:14:28.685901 master-0 kubenswrapper[7689]: I0307 21:14:28.684221 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:28.694169 master-0 kubenswrapper[7689]: I0307 21:14:28.690745 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"
Mar 07 21:14:28.698444 master-0 kubenswrapper[7689]: I0307 21:14:28.698264 7689 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 07 21:14:28.701516 master-0 kubenswrapper[7689]: I0307 21:14:28.700897 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q","kube-system/bootstrap-kube-scheduler-master-0","openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h","openshift-etcd/etcd-master-0-master-0","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-multus/multus-g6nmq","openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4","openshift-ingress-operator/ingress-operator-677db989d6-tklw9","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd","openshift-multus/multus-admission-controller-8d675b596-mmqbs","openshift-network-diagnostics/network-check-target-fr4qr","openshift-network-operator/network-operator-7c649bf6d4-v4xm9","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f","openshift-network-node-identity/network-node-identity-kpsm4","openshift-network-operator/iptables-alerter-n8nz9","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz","kube-system/bootstrap-kube-controller-manager-master-0","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k","openshift-multus/network-metrics-daemon-l2bdp","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-config-operator/openshift-config-operator-64488f9d78-cb227","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b","openshift-ovn-kubernetes/ovnkube-node-x9v76","assisted-installer/assisted-installer-controller-mqwls","openshift-dns-operator/dns-operator-589895fbb7-wqqqr","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m","openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg","openshift-multus/multus-additional-cni-plugins-xf7kg","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"]
Mar 07 21:14:28.701516 master-0 kubenswrapper[7689]: I0307 21:14:28.701367 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="assisted-installer/assisted-installer-controller-mqwls"
Mar 07 21:14:28.702006 master-0 kubenswrapper[7689]: I0307 21:14:28.701607 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:28.702006 master-0 kubenswrapper[7689]: I0307 21:14:28.701705 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:28.702107 master-0 kubenswrapper[7689]: I0307 21:14:28.702070 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:28.702153 master-0 kubenswrapper[7689]: I0307 21:14:28.702101 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:28.703672 master-0 kubenswrapper[7689]: I0307 21:14:28.702831 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:28.704247 master-0 kubenswrapper[7689]: I0307 21:14:28.704157 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:28.704962 master-0 kubenswrapper[7689]: I0307 21:14:28.704249 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:28.704962 master-0 kubenswrapper[7689]: I0307 21:14:28.704307 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:14:28.704962 master-0 kubenswrapper[7689]: I0307 21:14:28.704555 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:28.705443 master-0 kubenswrapper[7689]: I0307 21:14:28.705353 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:14:28.705795 master-0 kubenswrapper[7689]: I0307 21:14:28.705718 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 07 21:14:28.706353 master-0 kubenswrapper[7689]: I0307 21:14:28.706298 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 07 21:14:28.714216 master-0 kubenswrapper[7689]: I0307 21:14:28.712232 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 07 21:14:28.714216 master-0 kubenswrapper[7689]: I0307 21:14:28.713089 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:14:28.714216 master-0 kubenswrapper[7689]: I0307 21:14:28.713477 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:14:28.714216 master-0 kubenswrapper[7689]: I0307 21:14:28.713759 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:14:28.715792 master-0 kubenswrapper[7689]: I0307 21:14:28.715718 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:14:28.716171 master-0 kubenswrapper[7689]: I0307 21:14:28.716121 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 07 21:14:28.725590 master-0 kubenswrapper[7689]: I0307 21:14:28.725132 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 07 21:14:28.725895 master-0 kubenswrapper[7689]: I0307 21:14:28.725620 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 07 21:14:28.726223 master-0 kubenswrapper[7689]: I0307 21:14:28.726172 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.726481 master-0 kubenswrapper[7689]: I0307 21:14:28.726394 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 07 21:14:28.727781 master-0 kubenswrapper[7689]: I0307 21:14:28.727624 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 07 21:14:28.728053 master-0 kubenswrapper[7689]: I0307 21:14:28.727913 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 07 21:14:28.728270 master-0 kubenswrapper[7689]: I0307 21:14:28.728203 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images"
Mar 07 21:14:28.728419 master-0 kubenswrapper[7689]: I0307 21:14:28.728337 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 07 21:14:28.728583 master-0 kubenswrapper[7689]: I0307 21:14:28.728519 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 07 21:14:28.729143 master-0 kubenswrapper[7689]: I0307 21:14:28.728904 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 07 21:14:28.729143 master-0 kubenswrapper[7689]: I0307 21:14:28.728934 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.730886 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.731649 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.732254 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.732441 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.732671 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.733378 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736073 7689 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736204 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736340 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736438 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736580 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736784 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736944 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736963 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.736998 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.737404 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.737357 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.737602 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.737854 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.737993 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.737962 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 07 21:14:28.738165 master-0 kubenswrapper[7689]: I0307 21:14:28.737994 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 07 21:14:28.739223 master-0 kubenswrapper[7689]: I0307 21:14:28.738389 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 07 21:14:28.739223 master-0 kubenswrapper[7689]: I0307 21:14:28.738954 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 07 21:14:28.739304 master-0 kubenswrapper[7689]: I0307 21:14:28.739269 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt"
Mar 07 21:14:28.742059 master-0 kubenswrapper[7689]: I0307 21:14:28.741870 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 07 21:14:28.742059 master-0 kubenswrapper[7689]: I0307 21:14:28.741985 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742097 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742154 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742270 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742394 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742444 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742517 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742543 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742701 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.742965 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 07 21:14:28.743079 master-0 kubenswrapper[7689]: I0307 21:14:28.743072 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743477 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743540 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config"
Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743651 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743618 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743807 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743957 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt"
Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743991 7689
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744080 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744094 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744111 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744254 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744271 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744315 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743706 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744475 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743740 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.743753 7689 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744489 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 07 21:14:28.744806 master-0 kubenswrapper[7689]: I0307 21:14:28.744610 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.745038 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.745426 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.745585 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.746526 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.746608 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: E0307 21:14:28.746637 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.746672 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 
21:14:28.746792 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.746889 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.746979 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747053 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747440 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747542 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747620 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747711 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747820 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747920 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 
07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.747990 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.748007 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.748045 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.748104 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.748140 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.748158 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.748247 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.748294 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 07 21:14:28.757229 master-0 kubenswrapper[7689]: I0307 21:14:28.750252 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 21:14:28.761431 master-0 kubenswrapper[7689]: I0307 21:14:28.760233 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" 
Mar 07 21:14:28.761431 master-0 kubenswrapper[7689]: I0307 21:14:28.760401 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 21:14:28.765129 master-0 kubenswrapper[7689]: I0307 21:14:28.761646 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 21:14:28.765129 master-0 kubenswrapper[7689]: I0307 21:14:28.765105 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 21:14:28.765780 master-0 kubenswrapper[7689]: I0307 21:14:28.765732 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 07 21:14:28.777109 master-0 kubenswrapper[7689]: I0307 21:14:28.777016 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 07 21:14:28.779568 master-0 kubenswrapper[7689]: I0307 21:14:28.779447 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 21:14:28.782659 master-0 kubenswrapper[7689]: I0307 21:14:28.782631 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:28.782749 master-0 kubenswrapper[7689]: I0307 21:14:28.782666 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " 
pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:28.782749 master-0 kubenswrapper[7689]: I0307 21:14:28.782711 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.782836 master-0 kubenswrapper[7689]: I0307 21:14:28.782781 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhvg\" (UniqueName: \"kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:14:28.782836 master-0 kubenswrapper[7689]: I0307 21:14:28.782806 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.783063 master-0 kubenswrapper[7689]: I0307 21:14:28.783028 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:14:28.783063 master-0 kubenswrapper[7689]: I0307 21:14:28.783056 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:28.783204 master-0 kubenswrapper[7689]: I0307 21:14:28.783148 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:28.783305 master-0 kubenswrapper[7689]: I0307 21:14:28.783281 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:28.783355 master-0 kubenswrapper[7689]: I0307 21:14:28.783308 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:14:28.783355 master-0 kubenswrapper[7689]: I0307 21:14:28.783241 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 
07 21:14:28.783434 master-0 kubenswrapper[7689]: I0307 21:14:28.783397 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.783434 master-0 kubenswrapper[7689]: I0307 21:14:28.783417 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:14:28.783518 master-0 kubenswrapper[7689]: I0307 21:14:28.783437 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f748l\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783550 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f9rq\" (UniqueName: \"kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783580 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783609 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsspm\" (UniqueName: \"kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783631 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783650 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lng9v\" (UniqueName: \"kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783651 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783672 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.783697 master-0 kubenswrapper[7689]: I0307 21:14:28.783704 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:28.783973 master-0 kubenswrapper[7689]: I0307 21:14:28.783758 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:28.783973 master-0 kubenswrapper[7689]: I0307 21:14:28.783848 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:28.784052 master-0 kubenswrapper[7689]: I0307 21:14:28.783998 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:28.784052 master-0 kubenswrapper[7689]: I0307 21:14:28.784003 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.784052 master-0 kubenswrapper[7689]: I0307 21:14:28.784036 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.784160 master-0 kubenswrapper[7689]: I0307 21:14:28.784060 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:28.784160 master-0 kubenswrapper[7689]: I0307 21:14:28.784098 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.784160 master-0 kubenswrapper[7689]: I0307 
21:14:28.784123 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.784160 master-0 kubenswrapper[7689]: I0307 21:14:28.784145 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:28.784304 master-0 kubenswrapper[7689]: I0307 21:14:28.784221 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.784304 master-0 kubenswrapper[7689]: I0307 21:14:28.784241 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:28.784304 master-0 kubenswrapper[7689]: I0307 21:14:28.784268 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: 
\"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.784460 master-0 kubenswrapper[7689]: I0307 21:14:28.784307 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.784460 master-0 kubenswrapper[7689]: I0307 21:14:28.784333 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.784460 master-0 kubenswrapper[7689]: I0307 21:14:28.784341 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:28.784460 master-0 kubenswrapper[7689]: I0307 21:14:28.784408 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:28.784460 master-0 kubenswrapper[7689]: I0307 21:14:28.784445 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:28.784641 master-0 kubenswrapper[7689]: I0307 21:14:28.784477 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:28.784641 master-0 kubenswrapper[7689]: I0307 21:14:28.784509 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:28.784641 master-0 kubenswrapper[7689]: I0307 21:14:28.784541 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnnlw\" (UniqueName: \"kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:28.784641 master-0 kubenswrapper[7689]: I0307 21:14:28.784569 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:14:28.784641 master-0 kubenswrapper[7689]: I0307 21:14:28.784576 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.784641 master-0 kubenswrapper[7689]: I0307 21:14:28.784610 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:28.784641 master-0 kubenswrapper[7689]: I0307 21:14:28.784642 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgwj6\" (UniqueName: \"kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784660 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784677 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784724 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784730 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784760 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784787 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784811 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784822 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784838 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784876 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94dz\" (UniqueName: \"kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784904 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.784932 master-0 kubenswrapper[7689]: I0307 21:14:28.784930 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.784957 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qskh\" (UniqueName: \"kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.784999 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785027 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8ck\" (UniqueName: \"kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785051 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785112 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785143 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785172 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785197 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785229 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqwrr\" (UniqueName: \"kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785257 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785313 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785332 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785364 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785367 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:28.785435 master-0 kubenswrapper[7689]: I0307 21:14:28.785425 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785475 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785511 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785541 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tvr\" (UniqueName: \"kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785557 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785571 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbz9p\" (UniqueName: \"kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785601 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785636 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785664 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785690 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785714 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785747 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785775 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785800 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785822 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785845 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785873 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785891 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785901 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785932 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785961 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.786020 master-0 kubenswrapper[7689]: I0307 21:14:28.785999 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786045 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786086 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpvs\" (UniqueName: \"kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786136 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786174 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786003 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786050 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786273 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786280 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786328 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786386 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72ps\" (UniqueName: \"kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786408 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786420 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786432 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786434 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786415 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786522 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786555 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786637 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786668 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786673 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786732 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786761 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786771 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:28.786788 master-0 kubenswrapper[7689]: I0307 21:14:28.786800 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4st\" (UniqueName: \"kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.786850 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.786878 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65pgv\" (UniqueName: \"kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.786929 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.786954 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.786979 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787031 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787098 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787174 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787192 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787206 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787245 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787299 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787328 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787384 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787412 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c76ff\" (UniqueName: \"kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787471 7689 reconciler_common.go:218]
"operationExecutor.MountVolume started for volume \"kube-api-access-jbggb\" (UniqueName: \"kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787508 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787509 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787543 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxkw8\" (UniqueName: \"kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:28.787575 master-0 kubenswrapper[7689]: I0307 21:14:28.787592 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787615 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2bf\" (UniqueName: \"kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787643 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787643 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787662 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787700 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2w44\" (UniqueName: 
\"kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787722 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787743 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787764 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zb5zm\" (UniqueName: \"kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787782 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787806 7689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787808 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787858 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787879 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787939 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbmk\" (UniqueName: \"kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk\") pod \"csi-snapshot-controller-operator-5685fbc7d-txnh5\" (UID: 
\"ab2f6566-730d-46f5-92ed-79e3039d24e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.787995 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.788029 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.788064 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.788149 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.788193 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wjtgs\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.788233 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.788273 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:28.788285 master-0 kubenswrapper[7689]: I0307 21:14:28.788305 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788330 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " 
pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788408 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788732 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788769 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788773 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788470 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788767 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788875 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788906 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788911 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " 
pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788936 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788958 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.788982 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpztb\" (UniqueName: \"kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.789003 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.789029 7689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.789054 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.789079 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.789097 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.789116 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: 
\"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.789126 master-0 kubenswrapper[7689]: I0307 21:14:28.789138 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.789796 master-0 kubenswrapper[7689]: I0307 21:14:28.789264 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:28.789796 master-0 kubenswrapper[7689]: I0307 21:14:28.789299 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:14:28.789796 master-0 kubenswrapper[7689]: I0307 21:14:28.789345 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:28.789796 master-0 kubenswrapper[7689]: I0307 21:14:28.789452 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca\") pod 
\"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:28.789796 master-0 kubenswrapper[7689]: I0307 21:14:28.789613 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:28.792630 master-0 kubenswrapper[7689]: I0307 21:14:28.792602 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 07 21:14:28.812655 master-0 kubenswrapper[7689]: I0307 21:14:28.812523 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 21:14:28.818925 master-0 kubenswrapper[7689]: I0307 21:14:28.818876 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:14:28.831618 master-0 kubenswrapper[7689]: I0307 21:14:28.831573 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 07 21:14:28.833796 master-0 kubenswrapper[7689]: I0307 21:14:28.833735 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " 
pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:14:28.853404 master-0 kubenswrapper[7689]: I0307 21:14:28.853343 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 07 21:14:28.853854 master-0 kubenswrapper[7689]: I0307 21:14:28.853824 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:14:28.873264 master-0 kubenswrapper[7689]: I0307 21:14:28.873186 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 07 21:14:29.135064 master-0 kubenswrapper[7689]: I0307 21:14:29.134989 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.135064 master-0 kubenswrapper[7689]: I0307 21:14:29.135052 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:29.135064 master-0 kubenswrapper[7689]: I0307 21:14:29.135083 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135128 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135150 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135176 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135213 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135246 7689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135286 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135310 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135341 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135368 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 
21:14:29.135409 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135441 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135461 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135484 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135500 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " 
pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135531 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135577 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.135585 master-0 kubenswrapper[7689]: I0307 21:14:29.135608 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135642 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135699 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135750 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135769 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135896 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135931 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135985 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.135996 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136027 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136074 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136086 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136115 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136170 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136224 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136231 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136259 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136263 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136247 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136366 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136430 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136451 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136474 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 
21:14:29.136522 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.636492889 +0000 UTC m=+3.188820011 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136548 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.636534951 +0000 UTC m=+3.188861853 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136551 7689 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136588 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.636569441 +0000 UTC m=+3.188896333 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136593 7689 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136605 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136621 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: E0307 21:14:29.136402 7689 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 07 21:14:29.136598 master-0 kubenswrapper[7689]: I0307 21:14:29.136649 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.136703 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: 
\"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.136676 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.636646263 +0000 UTC m=+3.188973445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.136778 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.136802 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.636793388 +0000 UTC m=+3.189120560 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.136831 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.136836 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.636819119 +0000 UTC m=+3.189146241 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.136777 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.136878 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.136914 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.136941 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.136995 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") 
pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.137012 7689 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137018 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.137055 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.637044874 +0000 UTC m=+3.189371766 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137045 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.137076 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.637068205 +0000 UTC m=+3.189395097 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : secret "metrics-daemon-secret" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137086 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137097 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137121 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137126 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137140 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137164 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137177 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137224 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137252 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137300 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137362 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137413 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137477 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.137563 7689 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.137604 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. 
No retries permitted until 2026-03-07 21:14:29.637590318 +0000 UTC m=+3.189917310 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.137663 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: E0307 21:14:29.137734 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.637716612 +0000 UTC m=+3.190043524 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137773 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137809 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" 
(UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137837 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137877 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137907 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.137970 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138008 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138059 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138115 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138147 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138178 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:29.138699 
master-0 kubenswrapper[7689]: I0307 21:14:29.138213 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138247 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138277 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138315 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138426 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 
kubenswrapper[7689]: I0307 21:14:29.138583 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.138699 master-0 kubenswrapper[7689]: I0307 21:14:29.138675 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.138818 7689 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.138872 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.638850211 +0000 UTC m=+3.191177143 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.138920 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.138973 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.139488 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.139536 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.139558 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.139587 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.139610 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139655 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139719 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.639672863 +0000 UTC m=+3.191999755 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139764 7689 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139788 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.639778156 +0000 UTC m=+3.192105038 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139847 7689 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139867 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.639859038 +0000 UTC m=+3.192185930 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139905 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: E0307 21:14:29.139923 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:29.639918239 +0000 UTC m=+3.192245131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found Mar 07 21:14:29.141193 master-0 kubenswrapper[7689]: I0307 21:14:29.139974 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.150468 master-0 kubenswrapper[7689]: I0307 21:14:29.150395 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 21:14:29.150938 master-0 kubenswrapper[7689]: I0307 21:14:29.150870 7689 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 21:14:29.151211 master-0 kubenswrapper[7689]: I0307 21:14:29.151023 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 21:14:29.156250 master-0 kubenswrapper[7689]: I0307 21:14:29.155604 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.156250 master-0 kubenswrapper[7689]: I0307 21:14:29.155641 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:29.158061 master-0 kubenswrapper[7689]: I0307 21:14:29.157887 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.159017 master-0 kubenswrapper[7689]: I0307 21:14:29.158959 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f9rq\" (UniqueName: \"kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:29.159760 master-0 kubenswrapper[7689]: I0307 
21:14:29.159662 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhvg\" (UniqueName: \"kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:14:29.161433 master-0 kubenswrapper[7689]: I0307 21:14:29.161357 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lng9v\" (UniqueName: \"kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:29.163626 master-0 kubenswrapper[7689]: I0307 21:14:29.163567 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f748l\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:29.163859 master-0 kubenswrapper[7689]: I0307 21:14:29.163811 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsspm\" (UniqueName: \"kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:14:29.171200 master-0 kubenswrapper[7689]: I0307 21:14:29.171139 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rkvj\" (UniqueName: 
\"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:29.171342 master-0 kubenswrapper[7689]: I0307 21:14:29.171233 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:29.172087 master-0 kubenswrapper[7689]: I0307 21:14:29.171963 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnlw\" (UniqueName: \"kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:14:29.180496 master-0 kubenswrapper[7689]: I0307 21:14:29.180112 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:14:29.183233 master-0 kubenswrapper[7689]: I0307 21:14:29.183158 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " 
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:29.183744 master-0 kubenswrapper[7689]: I0307 21:14:29.183627 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgwj6\" (UniqueName: \"kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:29.206584 master-0 kubenswrapper[7689]: I0307 21:14:29.206486 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94dz\" (UniqueName: \"kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:14:29.225554 master-0 kubenswrapper[7689]: I0307 21:14:29.225463 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qskh\" (UniqueName: \"kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:14:29.251983 master-0 kubenswrapper[7689]: I0307 21:14:29.251820 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8ck\" (UniqueName: \"kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:14:29.264716 master-0 kubenswrapper[7689]: I0307 21:14:29.264634 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqwrr\" (UniqueName: \"kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr\") pod 
\"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:14:29.284905 master-0 kubenswrapper[7689]: I0307 21:14:29.284854 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tvr\" (UniqueName: \"kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:14:29.304964 master-0 kubenswrapper[7689]: I0307 21:14:29.304921 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbz9p\" (UniqueName: \"kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:29.310794 master-0 kubenswrapper[7689]: I0307 21:14:29.310060 7689 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 21:14:29.328662 master-0 kubenswrapper[7689]: I0307 21:14:29.328601 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:29.345290 master-0 kubenswrapper[7689]: I0307 21:14:29.345216 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpvs\" (UniqueName: 
\"kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:14:29.368664 master-0 kubenswrapper[7689]: I0307 21:14:29.368621 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72ps\" (UniqueName: \"kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:14:29.388494 master-0 kubenswrapper[7689]: I0307 21:14:29.388349 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4st\" (UniqueName: \"kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:14:29.408515 master-0 kubenswrapper[7689]: I0307 21:14:29.408458 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65pgv\" (UniqueName: \"kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:29.425913 master-0 kubenswrapper[7689]: I0307 21:14:29.425828 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:14:29.444942 master-0 kubenswrapper[7689]: I0307 21:14:29.444866 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:29.466214 master-0 kubenswrapper[7689]: I0307 21:14:29.466162 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76ff\" (UniqueName: \"kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:29.487949 master-0 kubenswrapper[7689]: I0307 21:14:29.487873 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbggb\" (UniqueName: \"kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:29.506137 master-0 kubenswrapper[7689]: I0307 21:14:29.506062 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxkw8\" (UniqueName: \"kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:29.527294 master-0 kubenswrapper[7689]: I0307 21:14:29.527257 7689 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-2j2bf\" (UniqueName: \"kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:14:29.543582 master-0 kubenswrapper[7689]: I0307 21:14:29.543513 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2w44\" (UniqueName: \"kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:29.566190 master-0 kubenswrapper[7689]: I0307 21:14:29.566132 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5zm\" (UniqueName: \"kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:14:29.587521 master-0 kubenswrapper[7689]: I0307 21:14:29.587447 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbmk\" (UniqueName: \"kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk\") pod \"csi-snapshot-controller-operator-5685fbc7d-txnh5\" (UID: \"ab2f6566-730d-46f5-92ed-79e3039d24e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" Mar 07 21:14:29.607289 master-0 kubenswrapper[7689]: I0307 21:14:29.607246 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjtgs\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" 
(UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:29.624752 master-0 kubenswrapper[7689]: I0307 21:14:29.624700 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:14:29.644039 master-0 kubenswrapper[7689]: I0307 21:14:29.643877 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpztb\" (UniqueName: \"kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:29.649899 master-0 kubenswrapper[7689]: I0307 21:14:29.649872 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:29.650117 master-0 kubenswrapper[7689]: E0307 21:14:29.650070 7689 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 07 21:14:29.650222 master-0 kubenswrapper[7689]: I0307 21:14:29.650168 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " 
pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:29.650268 master-0 kubenswrapper[7689]: E0307 21:14:29.650186 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.650156818 +0000 UTC m=+4.202483720 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : secret "metrics-daemon-secret" not found Mar 07 21:14:29.650308 master-0 kubenswrapper[7689]: I0307 21:14:29.650273 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:29.650343 master-0 kubenswrapper[7689]: I0307 21:14:29.650308 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:29.650343 master-0 kubenswrapper[7689]: E0307 21:14:29.650321 7689 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:29.650411 master-0 kubenswrapper[7689]: I0307 21:14:29.650364 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:29.650411 master-0 kubenswrapper[7689]: E0307 21:14:29.650377 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.650362523 +0000 UTC m=+4.202689415 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found Mar 07 21:14:29.650675 master-0 kubenswrapper[7689]: E0307 21:14:29.650599 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 07 21:14:29.650839 master-0 kubenswrapper[7689]: E0307 21:14:29.650803 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.650757914 +0000 UTC m=+4.203084846 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found Mar 07 21:14:29.650931 master-0 kubenswrapper[7689]: E0307 21:14:29.650906 7689 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:29.650978 master-0 kubenswrapper[7689]: E0307 21:14:29.650962 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.650944279 +0000 UTC m=+4.203271211 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:29.651058 master-0 kubenswrapper[7689]: I0307 21:14:29.651022 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:29.651129 master-0 kubenswrapper[7689]: I0307 21:14:29.651095 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") 
pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:29.651203 master-0 kubenswrapper[7689]: I0307 21:14:29.651164 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:29.651255 master-0 kubenswrapper[7689]: I0307 21:14:29.651231 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:29.651307 master-0 kubenswrapper[7689]: I0307 21:14:29.651279 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:29.651355 master-0 kubenswrapper[7689]: I0307 21:14:29.651331 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " 
pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:29.651394 master-0 kubenswrapper[7689]: I0307 21:14:29.651381 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:29.651473 master-0 kubenswrapper[7689]: I0307 21:14:29.651435 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:29.651523 master-0 kubenswrapper[7689]: I0307 21:14:29.651485 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:29.651726 master-0 kubenswrapper[7689]: E0307 21:14:29.651668 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 07 21:14:29.651808 master-0 kubenswrapper[7689]: E0307 21:14:29.651770 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.651747959 +0000 UTC m=+4.204074851 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found Mar 07 21:14:29.651912 master-0 kubenswrapper[7689]: E0307 21:14:29.651873 7689 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:29.652005 master-0 kubenswrapper[7689]: E0307 21:14:29.651888 7689 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 07 21:14:29.652005 master-0 kubenswrapper[7689]: I0307 21:14:29.651923 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:29.652302 master-0 kubenswrapper[7689]: E0307 21:14:29.652015 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.651947155 +0000 UTC m=+4.204274057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:29.652302 master-0 kubenswrapper[7689]: E0307 21:14:29.652036 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.652027477 +0000 UTC m=+4.204354379 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found Mar 07 21:14:29.652302 master-0 kubenswrapper[7689]: E0307 21:14:29.652248 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:29.652389 master-0 kubenswrapper[7689]: E0307 21:14:29.652276 7689 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:29.652427 master-0 kubenswrapper[7689]: E0307 21:14:29.652390 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 07 21:14:29.652427 master-0 kubenswrapper[7689]: E0307 21:14:29.652299 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. 
No retries permitted until 2026-03-07 21:14:30.652282474 +0000 UTC m=+4.204609406 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:29.652427 master-0 kubenswrapper[7689]: E0307 21:14:29.652338 7689 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 07 21:14:29.652526 master-0 kubenswrapper[7689]: E0307 21:14:29.652444 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.652431497 +0000 UTC m=+4.204758389 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found Mar 07 21:14:29.652526 master-0 kubenswrapper[7689]: E0307 21:14:29.652458 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.652452928 +0000 UTC m=+4.204779820 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found Mar 07 21:14:29.652526 master-0 kubenswrapper[7689]: E0307 21:14:29.652472 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.652467158 +0000 UTC m=+4.204794050 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found Mar 07 21:14:29.652526 master-0 kubenswrapper[7689]: E0307 21:14:29.652487 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:29.652703 master-0 kubenswrapper[7689]: E0307 21:14:29.652553 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.6525189 +0000 UTC m=+4.204845832 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:29.652703 master-0 kubenswrapper[7689]: E0307 21:14:29.652564 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:29.652703 master-0 kubenswrapper[7689]: E0307 21:14:29.652603 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.652590731 +0000 UTC m=+4.204917843 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:29.652703 master-0 kubenswrapper[7689]: E0307 21:14:29.652634 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 07 21:14:29.652832 master-0 kubenswrapper[7689]: E0307 21:14:29.652768 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.652678104 +0000 UTC m=+4.205005036 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found Mar 07 21:14:29.652832 master-0 kubenswrapper[7689]: E0307 21:14:29.652773 7689 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 07 21:14:29.652897 master-0 kubenswrapper[7689]: E0307 21:14:29.652856 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:30.652834238 +0000 UTC m=+4.205161310 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:14:29.665874 master-0 kubenswrapper[7689]: I0307 21:14:29.665830 7689 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 07 21:14:29.675104 master-0 kubenswrapper[7689]: I0307 21:14:29.675009 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:29.916224 master-0 kubenswrapper[7689]: I0307 21:14:29.916183 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:14:30.185057 master-0 kubenswrapper[7689]: I0307 21:14:30.184853 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:14:30.188771 master-0 kubenswrapper[7689]: I0307 21:14:30.188704 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:14:30.354986 master-0 kubenswrapper[7689]: E0307 21:14:30.354865 7689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3" Mar 07 21:14:30.355284 master-0 kubenswrapper[7689]: E0307 21:14:30.355160 7689 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:etcd-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,Command:[cluster-etcd-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml --terminate-on-files=/var/run/secrets/serving-cert/tls.crt 
--terminate-on-files=/var/run/secrets/serving-cert/tls.key --terminate-on-files=/var/run/secrets/etcd-client/tls.crt --terminate-on-files=/var/run/secrets/etcd-client/tls.key --terminate-on-files=/var/run/configmaps/etcd-ca/ca-bundle.crt --terminate-on-files=/var/run/configmaps/etcd-service-ca/service-ca.crt],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPENSHIFT_PROFILE,Value:web,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-service-ca,ReadOnly:false,MountPath:/var/run/configmaps/etcd-service-ca,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etcd-client,ReadOnly:false,MountPath:/var/run/secrets/etcd-client,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-6qskh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:30,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
etcd-operator-5884b9cd56-lc94h_openshift-etcd-operator(5f82d4aa-0cb5-477f-944e-745a21d124fc): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 21:14:30.356422 master-0 kubenswrapper[7689]: E0307 21:14:30.356355 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"etcd-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" podUID="5f82d4aa-0cb5-477f-944e-745a21d124fc" Mar 07 21:14:30.662991 master-0 kubenswrapper[7689]: I0307 21:14:30.662915 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:30.663269 master-0 kubenswrapper[7689]: E0307 21:14:30.663103 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:30.663375 master-0 kubenswrapper[7689]: I0307 21:14:30.663327 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:30.663445 master-0 kubenswrapper[7689]: E0307 21:14:30.663418 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert 
podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.663392057 +0000 UTC m=+6.215719149 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:30.663509 master-0 kubenswrapper[7689]: E0307 21:14:30.663459 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:30.663563 master-0 kubenswrapper[7689]: E0307 21:14:30.663542 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.66352964 +0000 UTC m=+6.215856732 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:30.664634 master-0 kubenswrapper[7689]: I0307 21:14:30.664603 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:30.664667 master-0 kubenswrapper[7689]: I0307 21:14:30.664650 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:30.664738 master-0 kubenswrapper[7689]: I0307 21:14:30.664719 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:30.664769 master-0 kubenswrapper[7689]: I0307 21:14:30.664747 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " 
pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:30.664809 master-0 kubenswrapper[7689]: E0307 21:14:30.664784 7689 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 07 21:14:30.664841 master-0 kubenswrapper[7689]: I0307 21:14:30.664817 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:30.664870 master-0 kubenswrapper[7689]: E0307 21:14:30.664825 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.664815794 +0000 UTC m=+6.217142686 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found Mar 07 21:14:30.664901 master-0 kubenswrapper[7689]: E0307 21:14:30.664870 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:30.664901 master-0 kubenswrapper[7689]: I0307 21:14:30.664883 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:30.664959 master-0 kubenswrapper[7689]: E0307 21:14:30.664892 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.664886586 +0000 UTC m=+6.217213478 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:30.664992 master-0 kubenswrapper[7689]: I0307 21:14:30.664959 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:30.664992 master-0 kubenswrapper[7689]: E0307 21:14:30.664979 7689 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 07 21:14:30.664992 master-0 kubenswrapper[7689]: I0307 21:14:30.664984 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:30.665077 master-0 kubenswrapper[7689]: E0307 21:14:30.665004 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.664997598 +0000 UTC m=+6.217324490 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:14:30.665077 master-0 kubenswrapper[7689]: E0307 21:14:30.664934 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 07 21:14:30.665148 master-0 kubenswrapper[7689]: E0307 21:14:30.665079 7689 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:30.665148 master-0 kubenswrapper[7689]: E0307 21:14:30.665100 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665094291 +0000 UTC m=+6.217421183 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:30.665148 master-0 kubenswrapper[7689]: E0307 21:14:30.665115 7689 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 07 21:14:30.665229 master-0 kubenswrapper[7689]: E0307 21:14:30.665143 7689 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:30.665261 master-0 kubenswrapper[7689]: I0307 21:14:30.665030 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:30.665261 master-0 kubenswrapper[7689]: E0307 21:14:30.665157 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665145582 +0000 UTC m=+6.217472704 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : secret "metrics-daemon-secret" not found Mar 07 21:14:30.665325 master-0 kubenswrapper[7689]: E0307 21:14:30.665277 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665267225 +0000 UTC m=+6.217594107 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found Mar 07 21:14:30.665325 master-0 kubenswrapper[7689]: E0307 21:14:30.665176 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 07 21:14:30.665325 master-0 kubenswrapper[7689]: E0307 21:14:30.665304 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665298366 +0000 UTC m=+6.217625258 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found Mar 07 21:14:30.665410 master-0 kubenswrapper[7689]: E0307 21:14:30.665354 7689 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:30.665410 master-0 kubenswrapper[7689]: E0307 21:14:30.665385 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665375528 +0000 UTC m=+6.217702410 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found Mar 07 21:14:30.665410 master-0 kubenswrapper[7689]: I0307 21:14:30.665401 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:30.665509 master-0 kubenswrapper[7689]: I0307 21:14:30.665425 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:30.665509 master-0 kubenswrapper[7689]: I0307 21:14:30.665453 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:30.665509 master-0 kubenswrapper[7689]: I0307 21:14:30.665477 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:30.665509 master-0 kubenswrapper[7689]: E0307 21:14:30.665494 7689 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:30.665637 master-0 kubenswrapper[7689]: E0307 21:14:30.665550 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665510301 +0000 UTC m=+6.217837193 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:30.665637 master-0 kubenswrapper[7689]: E0307 21:14:30.665552 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 07 21:14:30.665637 master-0 kubenswrapper[7689]: E0307 21:14:30.665584 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 07 21:14:30.665637 master-0 kubenswrapper[7689]: E0307 21:14:30.665586 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665577723 +0000 UTC m=+6.217904615 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found Mar 07 21:14:30.665637 master-0 kubenswrapper[7689]: E0307 21:14:30.665614 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665606964 +0000 UTC m=+6.217933856 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found Mar 07 21:14:30.665637 master-0 kubenswrapper[7689]: E0307 21:14:30.665628 7689 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 07 21:14:30.665865 master-0 kubenswrapper[7689]: E0307 21:14:30.665650 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.665643285 +0000 UTC m=+6.217970177 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found Mar 07 21:14:30.665992 master-0 kubenswrapper[7689]: E0307 21:14:30.665911 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:32.66582609 +0000 UTC m=+6.218153022 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found Mar 07 21:14:31.139926 master-0 kubenswrapper[7689]: E0307 21:14:31.139705 7689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953" Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: E0307 21:14:31.140089 7689 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: container &Container{Name:authentication-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953,Command:[/bin/bash -ec],Args:[if [ -s /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt ]; then Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: echo "Copying system trust bundle" Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: cp -f /var/run/configmaps/trusted-ca-bundle/ca-bundle.crt /etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: fi Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: exec authentication-operator operator --config=/var/run/configmaps/config/operator-config.yaml --v=2 --terminate-on-files=/var/run/configmaps/trusted-ca-bundle/ca-bundle.crt --terminate-on-files=/tmp/terminate Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:IMAGE_OAUTH_SERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3d3571ade02a7c61123d62c53fda6a57031a52c058c0571759dc09f96b23978f,ValueFrom:nil,},EnvVar{Name:IMAGE_OAUTH_APISERVER,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_OAUTH_SERVER_IMAGE_VERSION,Value:4.18.34_openshift,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{209715200 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:trusted-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/trusted-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:service-ca-bundle,ReadOnly:true,MountPath:/var/run/configmaps/service-ca-bundle,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gnnlw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:healthz,Port:{0 8443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:30,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod authentication-operator-7c6989d6c4-7w8wf_openshift-authentication-operator(ff7c5ff2-49d2-4a55-96d1-5244ae8ad602): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled Mar 07 21:14:31.140858 master-0 kubenswrapper[7689]: > logger="UnhandledError" Mar 07 21:14:31.141485 master-0 kubenswrapper[7689]: E0307 21:14:31.141317 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"authentication-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" podUID="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" Mar 07 21:14:31.502461 master-0 kubenswrapper[7689]: E0307 21:14:31.502292 7689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43" Mar 07 21:14:31.502908 master-0 kubenswrapper[7689]: E0307 21:14:31.502508 7689 kuberuntime_manager.go:1274] "Unhandled Error" err="init 
container &Container{Name:openshift-api,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43,Command:[write-available-featuresets --asset-output-dir=/available-featuregates --payload-version=$(OPERATOR_IMAGE_VERSION)],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:available-featuregates,ReadOnly:false,MountPath:/available-featuregates,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2w44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-config-operator-64488f9d78-cb227_openshift-config-operator(29624e4f-d970-4dfa-a8f1-515b73397c8f): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 21:14:31.504044 master-0 kubenswrapper[7689]: E0307 21:14:31.503972 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"openshift-api\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" Mar 07 21:14:31.530800 master-0 kubenswrapper[7689]: I0307 21:14:31.530736 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:14:31.538080 master-0 kubenswrapper[7689]: I0307 21:14:31.538030 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:14:32.129414 master-0 kubenswrapper[7689]: E0307 21:14:32.129333 7689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba" Mar 07 21:14:32.129772 master-0 kubenswrapper[7689]: E0307 21:14:32.129559 7689 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:service-ca-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,Command:[service-ca-operator operator],Args:[--config=/var/run/configmaps/config/operator-config.yaml -v=2],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{83886080 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zb5zm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod service-ca-operator-69b6fc6b88-cg9rz_openshift-service-ca-operator(24f69689-ff12-4786-af05-61429e9eadf8): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 21:14:32.131086 master-0 kubenswrapper[7689]: E0307 21:14:32.131041 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"service-ca-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" podUID="24f69689-ff12-4786-af05-61429e9eadf8" Mar 07 21:14:32.664339 master-0 kubenswrapper[7689]: E0307 21:14:32.664268 7689 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: 
context canceled" image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab" Mar 07 21:14:32.664917 master-0 kubenswrapper[7689]: E0307 21:14:32.664447 7689 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:openshift-apiserver-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab,Command:[cluster-openshift-apiserver-operator operator],Args:[--config=/var/run/configmaps/config/config.yaml],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:8443,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab,ValueFrom:nil,},EnvVar{Name:OPERATOR_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:OPERAND_IMAGE_VERSION,Value:4.18.34,ValueFrom:nil,},EnvVar{Name:KUBE_APISERVER_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:false,MountPath:/var/run/configmaps/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:serving-cert,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-kqwrr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod openshift-apiserver-operator-799b6db4d7-jtbd6_openshift-apiserver-operator(b88c5fbe-e19f-45b3-ab03-e1626f95776d): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 07 21:14:32.665706 master-0 kubenswrapper[7689]: E0307 21:14:32.665645 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"openshift-apiserver-operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" podUID="b88c5fbe-e19f-45b3-ab03-e1626f95776d" Mar 07 21:14:32.692230 master-0 kubenswrapper[7689]: I0307 21:14:32.691649 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:32.692481 master-0 kubenswrapper[7689]: I0307 21:14:32.692251 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:32.692481 master-0 kubenswrapper[7689]: E0307 21:14:32.692007 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:32.692481 master-0 kubenswrapper[7689]: E0307 21:14:32.692413 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.69238057 +0000 UTC m=+10.244707462 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:32.692481 master-0 kubenswrapper[7689]: I0307 21:14:32.692304 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:32.692624 master-0 kubenswrapper[7689]: I0307 21:14:32.692515 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:32.692624 master-0 kubenswrapper[7689]: E0307 21:14:32.692518 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:32.692624 master-0 kubenswrapper[7689]: I0307 21:14:32.692549 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:32.692624 master-0 kubenswrapper[7689]: I0307 21:14:32.692589 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:32.693037 master-0 kubenswrapper[7689]: E0307 21:14:32.692632 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.692596595 +0000 UTC m=+10.244923527 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:32.693037 master-0 kubenswrapper[7689]: I0307 21:14:32.692707 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:32.693037 master-0 kubenswrapper[7689]: E0307 21:14:32.692934 7689 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:32.693037 master-0 kubenswrapper[7689]: I0307 21:14:32.692981 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 
07 21:14:32.693037 master-0 kubenswrapper[7689]: E0307 21:14:32.693006 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.692980315 +0000 UTC m=+10.245307207 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: E0307 21:14:32.693058 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: E0307 21:14:32.693085 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693076589 +0000 UTC m=+10.245403611 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: I0307 21:14:32.693056 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: I0307 21:14:32.693118 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: I0307 21:14:32.693146 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: E0307 21:14:32.693151 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: E0307 21:14:32.693176 7689 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: I0307 21:14:32.693173 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:32.693223 master-0 kubenswrapper[7689]: E0307 21:14:32.693228 7689 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693246 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693183471 +0000 UTC m=+10.245510553 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693267 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693258123 +0000 UTC m=+10.245585015 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693283 7689 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693289 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: I0307 21:14:32.693314 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693324 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693308684 +0000 UTC m=+10.245635606 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : secret "metrics-daemon-secret" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693397 7689 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693410 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693400017 +0000 UTC m=+10.245726909 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693414 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: I0307 21:14:32.693426 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693439 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693424397 +0000 UTC m=+10.245751329 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693457 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693449188 +0000 UTC m=+10.245776320 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: I0307 21:14:32.693472 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693480 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693468409 +0000 UTC m=+10.245795541 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693352 7689 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693499 7689 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693512 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693504769 +0000 UTC m=+10.245831921 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found
Mar 07 21:14:32.693525 master-0 kubenswrapper[7689]: E0307 21:14:32.693541 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.69352955 +0000 UTC m=+10.245856472 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found
Mar 07 21:14:32.695074 master-0 kubenswrapper[7689]: E0307 21:14:32.693572 7689 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 07 21:14:32.695074 master-0 kubenswrapper[7689]: E0307 21:14:32.693599 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693591942 +0000 UTC m=+10.245919094 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found
Mar 07 21:14:32.695074 master-0 kubenswrapper[7689]: E0307 21:14:32.693600 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 07 21:14:32.695074 master-0 kubenswrapper[7689]: E0307 21:14:32.693630 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:36.693622932 +0000 UTC m=+10.245949824 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found
Mar 07 21:14:32.876354 master-0 kubenswrapper[7689]: I0307 21:14:32.876271 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-fr4qr"]
Mar 07 21:14:33.431190 master-0 kubenswrapper[7689]: W0307 21:14:33.431105 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15270349_f3aa_43bc_88a8_f0fff3aa2528.slice/crio-29e340b2b6b88ee1d2fe3338e7cc62956472917066207c7d22fcd11ca5797fe0 WatchSource:0}: Error finding container 29e340b2b6b88ee1d2fe3338e7cc62956472917066207c7d22fcd11ca5797fe0: Status 404 returned error can't find the container with id 29e340b2b6b88ee1d2fe3338e7cc62956472917066207c7d22fcd11ca5797fe0
Mar 07 21:14:33.565750 master-0 kubenswrapper[7689]: I0307 21:14:33.565639 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:33.607941 master-0 kubenswrapper[7689]: I0307 21:14:33.607882 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:33.725608 master-0 kubenswrapper[7689]: I0307 21:14:33.725452 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fr4qr" event={"ID":"15270349-f3aa-43bc-88a8-f0fff3aa2528","Type":"ContainerStarted","Data":"29e340b2b6b88ee1d2fe3338e7cc62956472917066207c7d22fcd11ca5797fe0"}
Mar 07 21:14:33.725608 master-0 kubenswrapper[7689]: I0307 21:14:33.725511 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:33.725608 master-0 kubenswrapper[7689]: I0307 21:14:33.725550 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:34.337295 master-0 kubenswrapper[7689]: I0307 21:14:34.336191 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:34.337295 master-0 kubenswrapper[7689]: I0307 21:14:34.336406 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:34.341544 master-0 kubenswrapper[7689]: I0307 21:14:34.341219 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:34.728029 master-0 kubenswrapper[7689]: I0307 21:14:34.727970 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:36.291719 master-0 kubenswrapper[7689]: I0307 21:14:36.291297 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:36.292762 master-0 kubenswrapper[7689]: I0307 21:14:36.292748 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:36.292863 master-0 kubenswrapper[7689]: I0307 21:14:36.292851 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:36.328079 master-0 kubenswrapper[7689]: I0307 21:14:36.327976 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:14:36.739074 master-0 kubenswrapper[7689]: I0307 21:14:36.738994 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" event={"ID":"bd633b72-3d0b-4601-a2c2-3f487d943b35","Type":"ContainerStarted","Data":"8db5d27113ab5fae894c6cc0107da033c6196250dc7c341eeb4aaf2ff2d3a924"}
Mar 07 21:14:36.741437 master-0 kubenswrapper[7689]: I0307 21:14:36.741360 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-fr4qr" event={"ID":"15270349-f3aa-43bc-88a8-f0fff3aa2528","Type":"ContainerStarted","Data":"46a40161d8d321f4a27dcbd691ff6b848c421a9b175b3bd4556e882539a23c95"}
Mar 07 21:14:36.741556 master-0 kubenswrapper[7689]: I0307 21:14:36.741481 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:14:36.743464 master-0 kubenswrapper[7689]: I0307 21:14:36.743424 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" event={"ID":"e543d99f-e0dc-49be-95bd-c39eabd05ce8","Type":"ContainerStarted","Data":"bb9512b327c952122a6ba9c90bf697a16d6d7a153e8ba4baf488a717c15e85eb"}
Mar 07 21:14:36.745557 master-0 kubenswrapper[7689]: I0307 21:14:36.745515 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" event={"ID":"abfb5602-7255-43d7-a510-e7f94885887e","Type":"ContainerStarted","Data":"98e7e40d5b40416680e1b256712d9b6487df5695b6f01c16e2334511df19f429"}
Mar 07 21:14:36.747857 master-0 kubenswrapper[7689]: I0307 21:14:36.747821 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" event={"ID":"ab2f6566-730d-46f5-92ed-79e3039d24e8","Type":"ContainerStarted","Data":"6cccee54a91d1198afbca96aa8060f7dbbf6cd82c693fb0dbe258a47b31e07b2"}
Mar 07 21:14:36.751251 master-0 kubenswrapper[7689]: I0307 21:14:36.751188 7689 generic.go:334] "Generic (PLEG): container finished" podID="8269652e-360f-43ef-9e7d-473c5f478275" containerID="2469613253a6fb83e25c2520824b0decb6e3207bc68cca5286a33e44b6873206" exitCode=0
Mar 07 21:14:36.751337 master-0 kubenswrapper[7689]: I0307 21:14:36.751247 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerDied","Data":"2469613253a6fb83e25c2520824b0decb6e3207bc68cca5286a33e44b6873206"}
Mar 07 21:14:36.754606 master-0 kubenswrapper[7689]: I0307 21:14:36.753780 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:36.755914 master-0 kubenswrapper[7689]: I0307 21:14:36.753674 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" event={"ID":"3faedef9-d507-48aa-82a8-f3dc9b5adeef","Type":"ContainerStarted","Data":"f737e30d954aa064b6cfef3a212e4d7f5057ece37e1afcdb2a92dd75d8adab26"}
Mar 07 21:14:36.761376 master-0 kubenswrapper[7689]: I0307 21:14:36.761301 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:14:36.761489 master-0 kubenswrapper[7689]: I0307 21:14:36.761390 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:14:36.761489 master-0 kubenswrapper[7689]: I0307 21:14:36.761431 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:36.761489 master-0 kubenswrapper[7689]: I0307 21:14:36.761465 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:14:36.761627 master-0 kubenswrapper[7689]: I0307 21:14:36.761506 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:14:36.761627 master-0 kubenswrapper[7689]: I0307 21:14:36.761542 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:14:36.761627 master-0 kubenswrapper[7689]: I0307 21:14:36.761581 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:14:36.761627 master-0 kubenswrapper[7689]: I0307 21:14:36.761619 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:14:36.761808 master-0 kubenswrapper[7689]: I0307 21:14:36.761666 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:14:36.761808 master-0 kubenswrapper[7689]: I0307 21:14:36.761730 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:36.761808 master-0 kubenswrapper[7689]: I0307 21:14:36.761772 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762008 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762028 7689 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762091 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762064831 +0000 UTC m=+18.314391733 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762138 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762111473 +0000 UTC m=+18.314438365 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : secret "metrics-daemon-secret" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762194 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762222 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762215725 +0000 UTC m=+18.314542617 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762268 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762292 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762284007 +0000 UTC m=+18.314610899 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762331 7689 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762353 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762347058 +0000 UTC m=+18.314673950 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762390 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762413 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.76240671 +0000 UTC m=+18.314733602 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762453 7689 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762471 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762465972 +0000 UTC m=+18.314792864 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762515 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found
Mar 07 21:14:36.762526 master-0 kubenswrapper[7689]: E0307 21:14:36.762534 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762528413 +0000 UTC m=+18.314855305 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.762569 7689 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.762590 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762584565 +0000 UTC m=+18.314911457 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: I0307 21:14:36.762632 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.762798 7689 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.762867 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.762896 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762866302 +0000 UTC m=+18.315193234 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.762803 7689 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.762927 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.762913043 +0000 UTC m=+18.315239965 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: I0307 21:14:36.762987 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: I0307 21:14:36.763025 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: I0307 21:14:36.763059 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:14:36.763175 master-0 kubenswrapper[7689]: E0307 21:14:36.763175 7689 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found
Mar 07 21:14:36.763624 master-0 kubenswrapper[7689]: E0307 21:14:36.763220 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics
podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.7632055 +0000 UTC m=+18.315532602 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:14:36.763624 master-0 kubenswrapper[7689]: E0307 21:14:36.763289 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:36.763624 master-0 kubenswrapper[7689]: E0307 21:14:36.763323 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.763311374 +0000 UTC m=+18.315638506 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:36.763624 master-0 kubenswrapper[7689]: E0307 21:14:36.763346 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.763335845 +0000 UTC m=+18.315663017 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found
Mar 07 21:14:36.763624 master-0 kubenswrapper[7689]: E0307 21:14:36.763416 7689 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found
Mar 07 21:14:36.763624 master-0 kubenswrapper[7689]: E0307 21:14:36.763451 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:14:44.763439347 +0000 UTC m=+18.315766259 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found
Mar 07 21:14:37.014303 master-0 kubenswrapper[7689]: I0307 21:14:37.011591 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"]
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: E0307 21:14:37.018935 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.018980 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: E0307 21:14:37.018991 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.018999 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.019086 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.019103 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.019622 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.019760 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.019773 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.020464 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"
Mar 07 21:14:37.021775 master-0 kubenswrapper[7689]: I0307 21:14:37.021392 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:37.036776 master-0 kubenswrapper[7689]: I0307 21:14:37.034847 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 07 21:14:37.036776 master-0 kubenswrapper[7689]: I0307 21:14:37.035746 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 07 21:14:37.049652 master-0 kubenswrapper[7689]: I0307 21:14:37.049597 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"]
Mar 07 21:14:37.107644 master-0 kubenswrapper[7689]: I0307 21:14:37.107217 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"]
Mar 07 21:14:37.108438 master-0 kubenswrapper[7689]: I0307 21:14:37.108125 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"
Mar 07 21:14:37.126292 master-0 kubenswrapper[7689]: I0307 21:14:37.126223 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"]
Mar 07 21:14:37.130474 master-0 kubenswrapper[7689]: I0307 21:14:37.130418 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:37.141276 master-0 kubenswrapper[7689]: I0307 21:14:37.141063 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:14:37.177792 master-0 kubenswrapper[7689]: I0307 21:14:37.177248 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zppz\" (UniqueName: \"kubernetes.io/projected/e38fc940-e59a-45ff-978b-fdcdc534a2a5-kube-api-access-2zppz\") pod \"migrator-57ccdf9b5-5l6h9\" (UID: \"e38fc940-e59a-45ff-978b-fdcdc534a2a5\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"
Mar 07 21:14:37.279115 master-0 kubenswrapper[7689]: I0307 21:14:37.278948 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zppz\" (UniqueName: \"kubernetes.io/projected/e38fc940-e59a-45ff-978b-fdcdc534a2a5-kube-api-access-2zppz\") pod \"migrator-57ccdf9b5-5l6h9\" (UID: \"e38fc940-e59a-45ff-978b-fdcdc534a2a5\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"
Mar 07 21:14:37.279563 master-0 kubenswrapper[7689]: I0307 21:14:37.279499 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nnk5\" (UniqueName: \"kubernetes.io/projected/7fa7b789-9201-493e-a96d-484a2622301a-kube-api-access-5nnk5\") pod \"csi-snapshot-controller-7577d6f48-kzjmp\" (UID: \"7fa7b789-9201-493e-a96d-484a2622301a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"
Mar 07 21:14:37.302631 master-0 kubenswrapper[7689]: I0307 21:14:37.302538 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zppz\" (UniqueName: \"kubernetes.io/projected/e38fc940-e59a-45ff-978b-fdcdc534a2a5-kube-api-access-2zppz\") pod \"migrator-57ccdf9b5-5l6h9\" (UID: \"e38fc940-e59a-45ff-978b-fdcdc534a2a5\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"
Mar 07 21:14:37.376221 master-0 kubenswrapper[7689]: I0307 21:14:37.376136 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"
Mar 07 21:14:37.381254 master-0 kubenswrapper[7689]: I0307 21:14:37.381176 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nnk5\" (UniqueName: \"kubernetes.io/projected/7fa7b789-9201-493e-a96d-484a2622301a-kube-api-access-5nnk5\") pod \"csi-snapshot-controller-7577d6f48-kzjmp\" (UID: \"7fa7b789-9201-493e-a96d-484a2622301a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"
Mar 07 21:14:37.401151 master-0 kubenswrapper[7689]: I0307 21:14:37.401105 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nnk5\" (UniqueName: \"kubernetes.io/projected/7fa7b789-9201-493e-a96d-484a2622301a-kube-api-access-5nnk5\") pod \"csi-snapshot-controller-7577d6f48-kzjmp\" (UID: \"7fa7b789-9201-493e-a96d-484a2622301a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"
Mar 07 21:14:37.433538 master-0 kubenswrapper[7689]: I0307 21:14:37.433179 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"
Mar 07 21:14:37.609483 master-0 kubenswrapper[7689]: I0307 21:14:37.609421 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"]
Mar 07 21:14:37.614169 master-0 kubenswrapper[7689]: W0307 21:14:37.614097 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode38fc940_e59a_45ff_978b_fdcdc534a2a5.slice/crio-a784a82dbf43a1c4004a5cc09c3b8c70da622a6ad91263a79de543269bb69473 WatchSource:0}: Error finding container a784a82dbf43a1c4004a5cc09c3b8c70da622a6ad91263a79de543269bb69473: Status 404 returned error can't find the container with id a784a82dbf43a1c4004a5cc09c3b8c70da622a6ad91263a79de543269bb69473
Mar 07 21:14:37.634495 master-0 kubenswrapper[7689]: I0307 21:14:37.634430 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"]
Mar 07 21:14:37.658911 master-0 kubenswrapper[7689]: W0307 21:14:37.658822 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fa7b789_9201_493e_a96d_484a2622301a.slice/crio-7cefc3721be62a4748cdf65d432f7c4f7609bcf801065d7c8f2bec228cbb8187 WatchSource:0}: Error finding container 7cefc3721be62a4748cdf65d432f7c4f7609bcf801065d7c8f2bec228cbb8187: Status 404 returned error can't find the container with id 7cefc3721be62a4748cdf65d432f7c4f7609bcf801065d7c8f2bec228cbb8187
Mar 07 21:14:37.761870 master-0 kubenswrapper[7689]: I0307 21:14:37.761763 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" event={"ID":"7fa7b789-9201-493e-a96d-484a2622301a","Type":"ContainerStarted","Data":"7cefc3721be62a4748cdf65d432f7c4f7609bcf801065d7c8f2bec228cbb8187"}
Mar 07 21:14:37.763029 master-0 kubenswrapper[7689]: I0307 21:14:37.762954 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9" event={"ID":"e38fc940-e59a-45ff-978b-fdcdc534a2a5","Type":"ContainerStarted","Data":"a784a82dbf43a1c4004a5cc09c3b8c70da622a6ad91263a79de543269bb69473"}
Mar 07 21:14:37.866546 master-0 kubenswrapper[7689]: I0307 21:14:37.865987 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-852rp"]
Mar 07 21:14:37.866916 master-0 kubenswrapper[7689]: I0307 21:14:37.866702 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:37.871307 master-0 kubenswrapper[7689]: I0307 21:14:37.870101 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 07 21:14:37.871307 master-0 kubenswrapper[7689]: I0307 21:14:37.870780 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 07 21:14:37.871307 master-0 kubenswrapper[7689]: I0307 21:14:37.871022 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 07 21:14:37.872495 master-0 kubenswrapper[7689]: I0307 21:14:37.871406 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 07 21:14:37.872495 master-0 kubenswrapper[7689]: I0307 21:14:37.871527 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 07 21:14:37.872495 master-0 kubenswrapper[7689]: I0307 21:14:37.871650 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 07 21:14:37.890182 master-0 kubenswrapper[7689]: I0307 21:14:37.890125 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-852rp"]
Mar 07 21:14:37.995006 master-0 kubenswrapper[7689]: I0307 21:14:37.994905 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:37.995327 master-0 kubenswrapper[7689]: I0307 21:14:37.995078 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:37.995327 master-0 kubenswrapper[7689]: I0307 21:14:37.995139 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:37.995327 master-0 kubenswrapper[7689]: I0307 21:14:37.995310 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vh52\" (UniqueName: \"kubernetes.io/projected/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-kube-api-access-6vh52\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:37.995663 master-0 kubenswrapper[7689]: I0307 21:14:37.995376 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.096627 master-0 kubenswrapper[7689]: I0307 21:14:38.096527 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.096907 master-0 kubenswrapper[7689]: E0307 21:14:38.096819 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 07 21:14:38.096991 master-0 kubenswrapper[7689]: E0307 21:14:38.096957 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:38.596920792 +0000 UTC m=+12.149247924 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : configmap "config" not found
Mar 07 21:14:38.097062 master-0 kubenswrapper[7689]: I0307 21:14:38.096950 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.097110 master-0 kubenswrapper[7689]: I0307 21:14:38.097074 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.097227 master-0 kubenswrapper[7689]: E0307 21:14:38.097185 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 07 21:14:38.097300 master-0 kubenswrapper[7689]: E0307 21:14:38.097276 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:38.597250021 +0000 UTC m=+12.149576923 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : configmap "openshift-global-ca" not found
Mar 07 21:14:38.097440 master-0 kubenswrapper[7689]: E0307 21:14:38.097383 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 07 21:14:38.097497 master-0 kubenswrapper[7689]: I0307 21:14:38.097473 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vh52\" (UniqueName: \"kubernetes.io/projected/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-kube-api-access-6vh52\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.097559 master-0 kubenswrapper[7689]: E0307 21:14:38.097486 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:38.597460976 +0000 UTC m=+12.149787908 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : configmap "client-ca" not found
Mar 07 21:14:38.098068 master-0 kubenswrapper[7689]: I0307 21:14:38.097906 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.098208 master-0 kubenswrapper[7689]: E0307 21:14:38.098160 7689 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 07 21:14:38.098309 master-0 kubenswrapper[7689]: E0307 21:14:38.098268 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:38.598244697 +0000 UTC m=+12.150571629 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : secret "serving-cert" not found
Mar 07 21:14:38.131401 master-0 kubenswrapper[7689]: I0307 21:14:38.131222 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vh52\" (UniqueName: \"kubernetes.io/projected/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-kube-api-access-6vh52\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.604398 master-0 kubenswrapper[7689]: I0307 21:14:38.604343 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: E0307 21:14:38.604532 7689 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: E0307 21:14:38.604635 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:39.604608125 +0000 UTC m=+13.156935017 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : secret "serving-cert" not found
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: E0307 21:14:38.604718 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/config: configmap "config" not found
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: E0307 21:14:38.604784 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:39.604764849 +0000 UTC m=+13.157091741 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : configmap "config" not found
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: I0307 21:14:38.604547 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: I0307 21:14:38.604846 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: I0307 21:14:38.604885 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp"
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: E0307 21:14:38.605043 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: configmap "openshift-global-ca" not found
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: E0307 21:14:38.605134 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:39.605114018 +0000 UTC m=+13.157440910 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : configmap "openshift-global-ca" not found
Mar 07 21:14:38.605184 master-0 kubenswrapper[7689]: E0307 21:14:38.605151 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 07 21:14:38.605476 master-0 kubenswrapper[7689]: E0307 21:14:38.605265 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:39.605255341 +0000 UTC m=+13.157582233 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : configmap "client-ca" not found
Mar 07 21:14:38.862795 master-0 kubenswrapper[7689]: I0307 21:14:38.862071 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-852rp"]
Mar 07 21:14:38.862795 master-0 kubenswrapper[7689]: E0307 21:14:38.862434 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca config proxy-ca-bundles serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" podUID="6b5490a6-7bc4-4b1b-a049-a02beb9520f9"
Mar 07 21:14:38.862795 master-0 kubenswrapper[7689]: I0307 21:14:38.862671 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"]
Mar 07 21:14:38.866704 master-0 kubenswrapper[7689]: I0307 21:14:38.863613 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:38.871277 master-0 kubenswrapper[7689]: I0307 21:14:38.866964 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"]
Mar 07 21:14:38.871277 master-0 kubenswrapper[7689]: I0307 21:14:38.868317 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 07 21:14:38.871277 master-0 kubenswrapper[7689]: I0307 21:14:38.868804 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 07 21:14:38.871277 master-0 kubenswrapper[7689]: I0307 21:14:38.869061 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 07 21:14:38.879719 master-0 kubenswrapper[7689]: I0307 21:14:38.878444 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 07 21:14:38.884305 master-0 kubenswrapper[7689]: I0307 21:14:38.881221 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 07 21:14:39.012254 master-0 kubenswrapper[7689]: I0307 21:14:39.011458 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.012525 master-0 kubenswrapper[7689]: I0307 21:14:39.012307 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.012525 master-0 kubenswrapper[7689]: I0307 21:14:39.012355 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-config\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.012793 master-0 kubenswrapper[7689]: I0307 21:14:39.012743 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qflsf\" (UniqueName: \"kubernetes.io/projected/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-kube-api-access-qflsf\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.113869 master-0 kubenswrapper[7689]: I0307 21:14:39.113662 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qflsf\" (UniqueName: \"kubernetes.io/projected/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-kube-api-access-qflsf\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.113869 master-0 kubenswrapper[7689]: I0307 21:14:39.113779 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.113869 master-0 kubenswrapper[7689]: I0307 21:14:39.113804 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.113869 master-0 kubenswrapper[7689]: I0307 21:14:39.113824 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-config\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:39.114360 master-0 kubenswrapper[7689]: E0307 21:14:39.114189 7689 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 07 21:14:39.114360 master-0 kubenswrapper[7689]: E0307 21:14:39.114262 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:39.614243098 +0000 UTC m=+13.166569990 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : configmap "client-ca" not found Mar 07 21:14:39.114492 master-0 kubenswrapper[7689]: E0307 21:14:39.114454 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:39.114557 master-0 kubenswrapper[7689]: E0307 21:14:39.114503 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:39.614489634 +0000 UTC m=+13.166816766 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : secret "serving-cert" not found Mar 07 21:14:39.116815 master-0 kubenswrapper[7689]: I0307 21:14:39.115286 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-config\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:39.138868 master-0 kubenswrapper[7689]: I0307 21:14:39.133993 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qflsf\" (UniqueName: \"kubernetes.io/projected/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-kube-api-access-qflsf\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " 
pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:39.620392 master-0 kubenswrapper[7689]: I0307 21:14:39.620222 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.621389 master-0 kubenswrapper[7689]: E0307 21:14:39.620495 7689 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:39.621389 master-0 kubenswrapper[7689]: E0307 21:14:39.620603 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:41.620579634 +0000 UTC m=+15.172906526 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : secret "serving-cert" not found Mar 07 21:14:39.621389 master-0 kubenswrapper[7689]: I0307 21:14:39.621154 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:39.621783 master-0 kubenswrapper[7689]: I0307 21:14:39.621237 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: I0307 21:14:39.621873 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: E0307 21:14:39.622089 7689 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: E0307 21:14:39.622230 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca 
podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:40.622191336 +0000 UTC m=+14.174518268 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : configmap "client-ca" not found Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: I0307 21:14:39.622330 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: I0307 21:14:39.622408 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: E0307 21:14:39.622654 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: E0307 21:14:39.622730 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca podName:6b5490a6-7bc4-4b1b-a049-a02beb9520f9 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:41.62271542 +0000 UTC m=+15.175042352 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca") pod "controller-manager-6f7fd6c796-852rp" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9") : configmap "client-ca" not found Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: I0307 21:14:39.624208 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: E0307 21:14:39.624378 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:39.626412 master-0 kubenswrapper[7689]: E0307 21:14:39.624432 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:40.624413874 +0000 UTC m=+14.176740806 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : secret "serving-cert" not found Mar 07 21:14:39.627940 master-0 kubenswrapper[7689]: I0307 21:14:39.627880 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles\") pod \"controller-manager-6f7fd6c796-852rp\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.773845 master-0 kubenswrapper[7689]: I0307 21:14:39.773759 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-n8nz9" event={"ID":"666475e5-df4b-44ef-a2d4-39d84ab91aad","Type":"ContainerStarted","Data":"f815460fb1610cde041614f22cca40bb340c9a6ffb62ee272770cd76d18b08bd"} Mar 07 21:14:39.774649 master-0 kubenswrapper[7689]: I0307 21:14:39.773793 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.786490 master-0 kubenswrapper[7689]: I0307 21:14:39.786411 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:39.930031 master-0 kubenswrapper[7689]: I0307 21:14:39.929846 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles\") pod \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " Mar 07 21:14:39.930031 master-0 kubenswrapper[7689]: I0307 21:14:39.929955 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config\") pod \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " Mar 07 21:14:39.930031 master-0 kubenswrapper[7689]: I0307 21:14:39.929985 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vh52\" (UniqueName: \"kubernetes.io/projected/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-kube-api-access-6vh52\") pod \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\" (UID: \"6b5490a6-7bc4-4b1b-a049-a02beb9520f9\") " Mar 07 21:14:39.932069 master-0 kubenswrapper[7689]: I0307 21:14:39.930828 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6b5490a6-7bc4-4b1b-a049-a02beb9520f9" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:14:39.932069 master-0 kubenswrapper[7689]: I0307 21:14:39.930487 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config" (OuterVolumeSpecName: "config") pod "6b5490a6-7bc4-4b1b-a049-a02beb9520f9" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:14:39.935529 master-0 kubenswrapper[7689]: I0307 21:14:39.935437 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-kube-api-access-6vh52" (OuterVolumeSpecName: "kube-api-access-6vh52") pod "6b5490a6-7bc4-4b1b-a049-a02beb9520f9" (UID: "6b5490a6-7bc4-4b1b-a049-a02beb9520f9"). InnerVolumeSpecName "kube-api-access-6vh52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:14:40.031775 master-0 kubenswrapper[7689]: I0307 21:14:40.031628 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:40.031775 master-0 kubenswrapper[7689]: I0307 21:14:40.031673 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vh52\" (UniqueName: \"kubernetes.io/projected/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-kube-api-access-6vh52\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:40.031775 master-0 kubenswrapper[7689]: I0307 21:14:40.031700 7689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:40.640960 master-0 kubenswrapper[7689]: I0307 21:14:40.640833 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:40.640960 master-0 kubenswrapper[7689]: I0307 21:14:40.640919 7689 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:40.642818 master-0 kubenswrapper[7689]: E0307 21:14:40.641244 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:40.642818 master-0 kubenswrapper[7689]: E0307 21:14:40.641251 7689 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:40.642818 master-0 kubenswrapper[7689]: E0307 21:14:40.641366 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:42.641326029 +0000 UTC m=+16.193652961 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : secret "serving-cert" not found Mar 07 21:14:40.642818 master-0 kubenswrapper[7689]: E0307 21:14:40.641658 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:42.641561625 +0000 UTC m=+16.193888557 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : configmap "client-ca" not found Mar 07 21:14:40.781134 master-0 kubenswrapper[7689]: I0307 21:14:40.781037 7689 generic.go:334] "Generic (PLEG): container finished" podID="8269652e-360f-43ef-9e7d-473c5f478275" containerID="a4635f7548cc73236087a85660453eabf881ce7b06599d4a7dd2447ded616584" exitCode=0 Mar 07 21:14:40.781479 master-0 kubenswrapper[7689]: I0307 21:14:40.781157 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerDied","Data":"a4635f7548cc73236087a85660453eabf881ce7b06599d4a7dd2447ded616584"} Mar 07 21:14:40.794840 master-0 kubenswrapper[7689]: I0307 21:14:40.794515 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9" event={"ID":"e38fc940-e59a-45ff-978b-fdcdc534a2a5","Type":"ContainerStarted","Data":"95ff386dc189d8815cac180cf2b77761087fa858873b5f018818e529f55b6cd3"} Mar 07 21:14:40.794840 master-0 kubenswrapper[7689]: I0307 21:14:40.794763 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9" event={"ID":"e38fc940-e59a-45ff-978b-fdcdc534a2a5","Type":"ContainerStarted","Data":"51cfeb8f64b335f9ff5f2b8acb9028bf93c798d808c7f33050a431a359493367"} Mar 07 21:14:40.798623 master-0 kubenswrapper[7689]: I0307 21:14:40.798511 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6f7fd6c796-852rp" Mar 07 21:14:40.798623 master-0 kubenswrapper[7689]: I0307 21:14:40.798528 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" event={"ID":"7fa7b789-9201-493e-a96d-484a2622301a","Type":"ContainerStarted","Data":"cd505551260b0980137e293c3b0596c534dcce88209069b8f3c0dc90efac996d"} Mar 07 21:14:40.831223 master-0 kubenswrapper[7689]: I0307 21:14:40.830386 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9" podStartSLOduration=2.470102398 podStartE2EDuration="4.830350175s" podCreationTimestamp="2026-03-07 21:14:36 +0000 UTC" firstStartedPulling="2026-03-07 21:14:37.618049171 +0000 UTC m=+11.170376073" lastFinishedPulling="2026-03-07 21:14:39.978296958 +0000 UTC m=+13.530623850" observedRunningTime="2026-03-07 21:14:40.829485594 +0000 UTC m=+14.381812526" watchObservedRunningTime="2026-03-07 21:14:40.830350175 +0000 UTC m=+14.382677077" Mar 07 21:14:40.861176 master-0 kubenswrapper[7689]: I0307 21:14:40.861054 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-852rp"] Mar 07 21:14:40.871466 master-0 kubenswrapper[7689]: I0307 21:14:40.871381 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6f7fd6c796-852rp"] Mar 07 21:14:40.890588 master-0 kubenswrapper[7689]: I0307 21:14:40.890434 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" podStartSLOduration=1.566995674 podStartE2EDuration="3.89039476s" podCreationTimestamp="2026-03-07 21:14:37 +0000 UTC" firstStartedPulling="2026-03-07 21:14:37.662318605 +0000 UTC m=+11.214645497" lastFinishedPulling="2026-03-07 21:14:39.985717691 
+0000 UTC m=+13.538044583" observedRunningTime="2026-03-07 21:14:40.887729731 +0000 UTC m=+14.440056663" watchObservedRunningTime="2026-03-07 21:14:40.89039476 +0000 UTC m=+14.442721692" Mar 07 21:14:41.050135 master-0 kubenswrapper[7689]: I0307 21:14:41.049930 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:41.050135 master-0 kubenswrapper[7689]: I0307 21:14:41.049995 7689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6b5490a6-7bc4-4b1b-a049-a02beb9520f9-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:14:42.297083 master-0 kubenswrapper[7689]: I0307 21:14:42.297011 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"] Mar 07 21:14:42.298179 master-0 kubenswrapper[7689]: I0307 21:14:42.297632 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.301762 master-0 kubenswrapper[7689]: I0307 21:14:42.301539 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 21:14:42.301911 master-0 kubenswrapper[7689]: I0307 21:14:42.301801 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 21:14:42.301911 master-0 kubenswrapper[7689]: I0307 21:14:42.301812 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 21:14:42.301911 master-0 kubenswrapper[7689]: I0307 21:14:42.301851 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 21:14:42.305348 master-0 kubenswrapper[7689]: I0307 21:14:42.302623 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 21:14:42.325169 master-0 kubenswrapper[7689]: I0307 21:14:42.324217 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"] Mar 07 21:14:42.325362 master-0 kubenswrapper[7689]: I0307 21:14:42.325136 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 21:14:42.366061 master-0 kubenswrapper[7689]: I0307 21:14:42.365979 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnxpn\" (UniqueName: \"kubernetes.io/projected/706c1d49-ef63-4383-8d46-c50f03aff6a3-kube-api-access-vnxpn\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.366299 master-0 kubenswrapper[7689]: I0307 21:14:42.366180 
7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-config\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.366299 master-0 kubenswrapper[7689]: I0307 21:14:42.366260 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.366299 master-0 kubenswrapper[7689]: I0307 21:14:42.366287 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-proxy-ca-bundles\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.366442 master-0 kubenswrapper[7689]: I0307 21:14:42.366351 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.467719 master-0 kubenswrapper[7689]: I0307 21:14:42.467613 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-config\") pod 
\"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.467998 master-0 kubenswrapper[7689]: I0307 21:14:42.467797 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.467998 master-0 kubenswrapper[7689]: I0307 21:14:42.467838 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-proxy-ca-bundles\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.468077 master-0 kubenswrapper[7689]: I0307 21:14:42.468043 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.468169 master-0 kubenswrapper[7689]: I0307 21:14:42.468121 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnxpn\" (UniqueName: \"kubernetes.io/projected/706c1d49-ef63-4383-8d46-c50f03aff6a3-kube-api-access-vnxpn\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.469021 master-0 kubenswrapper[7689]: E0307 21:14:42.468920 7689 secret.go:189] 
Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:42.469158 master-0 kubenswrapper[7689]: E0307 21:14:42.469112 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:42.969070627 +0000 UTC m=+16.521397589 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : secret "serving-cert" not found Mar 07 21:14:42.469720 master-0 kubenswrapper[7689]: E0307 21:14:42.469633 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:42.469720 master-0 kubenswrapper[7689]: I0307 21:14:42.469665 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-config\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.469875 master-0 kubenswrapper[7689]: E0307 21:14:42.469737 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:42.969721573 +0000 UTC m=+16.522048475 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : configmap "client-ca" not found Mar 07 21:14:42.470543 master-0 kubenswrapper[7689]: I0307 21:14:42.470460 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-proxy-ca-bundles\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.493310 master-0 kubenswrapper[7689]: I0307 21:14:42.493237 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnxpn\" (UniqueName: \"kubernetes.io/projected/706c1d49-ef63-4383-8d46-c50f03aff6a3-kube-api-access-vnxpn\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.671325 master-0 kubenswrapper[7689]: I0307 21:14:42.671232 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:42.671325 master-0 kubenswrapper[7689]: I0307 21:14:42.671303 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " 
pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:42.671662 master-0 kubenswrapper[7689]: E0307 21:14:42.671442 7689 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:42.671662 master-0 kubenswrapper[7689]: E0307 21:14:42.671622 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:46.671596145 +0000 UTC m=+20.223923037 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : configmap "client-ca" not found Mar 07 21:14:42.671763 master-0 kubenswrapper[7689]: E0307 21:14:42.671731 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:42.671850 master-0 kubenswrapper[7689]: E0307 21:14:42.671818 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:46.67179062 +0000 UTC m=+20.224117582 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : secret "serving-cert" not found Mar 07 21:14:42.689992 master-0 kubenswrapper[7689]: I0307 21:14:42.689834 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b5490a6-7bc4-4b1b-a049-a02beb9520f9" path="/var/lib/kubelet/pods/6b5490a6-7bc4-4b1b-a049-a02beb9520f9/volumes" Mar 07 21:14:42.975226 master-0 kubenswrapper[7689]: I0307 21:14:42.975011 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.975226 master-0 kubenswrapper[7689]: I0307 21:14:42.975107 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:42.975577 master-0 kubenswrapper[7689]: E0307 21:14:42.975371 7689 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:42.975577 master-0 kubenswrapper[7689]: E0307 21:14:42.975411 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:42.975577 master-0 kubenswrapper[7689]: E0307 21:14:42.975550 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert 
podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:43.975505016 +0000 UTC m=+17.527831978 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : secret "serving-cert" not found Mar 07 21:14:42.975813 master-0 kubenswrapper[7689]: E0307 21:14:42.975594 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:43.975577758 +0000 UTC m=+17.527904690 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : configmap "client-ca" not found Mar 07 21:14:43.816583 master-0 kubenswrapper[7689]: I0307 21:14:43.816492 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerStarted","Data":"e4c20cfb39db1342bdb31f41fc9c1caf9efa43065ea9e9334f061db96ddead54"} Mar 07 21:14:43.993140 master-0 kubenswrapper[7689]: I0307 21:14:43.992946 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:43.993388 master-0 kubenswrapper[7689]: E0307 21:14:43.993342 7689 secret.go:189] Couldn't get secret 
openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:43.993530 master-0 kubenswrapper[7689]: E0307 21:14:43.993495 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:45.993455167 +0000 UTC m=+19.545782099 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : secret "serving-cert" not found Mar 07 21:14:43.993530 master-0 kubenswrapper[7689]: I0307 21:14:43.993364 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:43.994167 master-0 kubenswrapper[7689]: E0307 21:14:43.994124 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:43.994324 master-0 kubenswrapper[7689]: E0307 21:14:43.994240 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:45.994211477 +0000 UTC m=+19.546538399 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : configmap "client-ca" not found Mar 07 21:14:44.806220 master-0 kubenswrapper[7689]: I0307 21:14:44.805665 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:14:44.806220 master-0 kubenswrapper[7689]: I0307 21:14:44.806183 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:44.806220 master-0 kubenswrapper[7689]: E0307 21:14:44.805864 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 07 21:14:44.806839 master-0 kubenswrapper[7689]: E0307 21:14:44.806289 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.806263823 +0000 UTC m=+34.358590715 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found Mar 07 21:14:44.806839 master-0 kubenswrapper[7689]: E0307 21:14:44.806365 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/performance-addon-operator-webhook-cert: secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:44.806839 master-0 kubenswrapper[7689]: E0307 21:14:44.806419 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.806407106 +0000 UTC m=+34.358733998 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "performance-addon-operator-webhook-cert" not found Mar 07 21:14:44.806839 master-0 kubenswrapper[7689]: I0307 21:14:44.806833 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:14:44.807143 master-0 kubenswrapper[7689]: E0307 21:14:44.806944 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-operator-tls: secret "cluster-baremetal-operator-tls" not found Mar 07 
21:14:44.807143 master-0 kubenswrapper[7689]: E0307 21:14:44.806980 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.806970181 +0000 UTC m=+34.359297073 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-baremetal-operator-tls" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-operator-tls" not found Mar 07 21:14:44.807143 master-0 kubenswrapper[7689]: I0307 21:14:44.807017 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:14:44.807143 master-0 kubenswrapper[7689]: I0307 21:14:44.807049 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:14:44.807143 master-0 kubenswrapper[7689]: I0307 21:14:44.807078 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 
21:14:44.807441 master-0 kubenswrapper[7689]: I0307 21:14:44.807245 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:14:44.807441 master-0 kubenswrapper[7689]: E0307 21:14:44.807280 7689 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 07 21:14:44.807441 master-0 kubenswrapper[7689]: E0307 21:14:44.807352 7689 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 07 21:14:44.807441 master-0 kubenswrapper[7689]: E0307 21:14:44.807359 7689 secret.go:189] Couldn't get secret openshift-dns-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:44.807441 master-0 kubenswrapper[7689]: I0307 21:14:44.807294 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:14:44.807441 master-0 kubenswrapper[7689]: E0307 21:14:44.807392 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807381522 +0000 UTC m=+34.359708414 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:14:44.807441 master-0 kubenswrapper[7689]: E0307 21:14:44.807434 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls podName:61a9fce6-50e1-413c-9ec0-177d6e903bdd nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807422053 +0000 UTC m=+34.359748955 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls") pod "dns-operator-589895fbb7-wqqqr" (UID: "61a9fce6-50e1-413c-9ec0-177d6e903bdd") : secret "metrics-tls" not found Mar 07 21:14:44.807441 master-0 kubenswrapper[7689]: E0307 21:14:44.807453 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807442473 +0000 UTC m=+34.359769365 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: I0307 21:14:44.807472 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: I0307 21:14:44.807507 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: I0307 21:14:44.807532 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: I0307 21:14:44.807557 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " 
pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807585 7689 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807628 7689 secret.go:189] Couldn't get secret openshift-cluster-version/cluster-version-operator-serving-cert: secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807628 7689 secret.go:189] Couldn't get secret openshift-cluster-node-tuning-operator/node-tuning-operator-tls: secret "node-tuning-operator-tls" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807661 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807623178 +0000 UTC m=+34.359950110 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : secret "metrics-daemon-secret" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807712 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls podName:f8c93e0d-54e5-4c80-9d69-a70317baeacf nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807674449 +0000 UTC m=+34.360001461 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "node-tuning-operator-tls" (UniqueName: "kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls") pod "cluster-node-tuning-operator-66c7586884-sxqnh" (UID: "f8c93e0d-54e5-4c80-9d69-a70317baeacf") : secret "node-tuning-operator-tls" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807740 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert podName:3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807730421 +0000 UTC m=+34.360057383 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert") pod "cluster-version-operator-745944c6b7-fjbl4" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b") : secret "cluster-version-operator-serving-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: I0307 21:14:44.807761 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: I0307 21:14:44.807789 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: I0307 21:14:44.807822 7689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807878 7689 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807906 7689 secret.go:189] Couldn't get secret openshift-image-registry/image-registry-operator-tls: secret "image-registry-operator-tls" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807921 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807907995 +0000 UTC m=+34.360234887 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807939 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls podName:dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.807928876 +0000 UTC m=+34.360255768 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "image-registry-operator-tls" (UniqueName: "kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls") pod "cluster-image-registry-operator-86d6d77c7c-kg26q" (UID: "dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2") : secret "image-registry-operator-tls" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807949 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.807998 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.808017 7689 secret.go:189] Couldn't get secret openshift-machine-api/cluster-baremetal-webhook-server-cert: secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.808049 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert podName:a61a736a-66e5-4ca1-a8a7-088cf73cfcce nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.808039489 +0000 UTC m=+34.360366471 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert") pod "cluster-baremetal-operator-5cdb4c5598-nmwjr" (UID: "a61a736a-66e5-4ca1-a8a7-088cf73cfcce") : secret "cluster-baremetal-webhook-server-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.808066 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. 
No retries permitted until 2026-03-07 21:15:00.808057429 +0000 UTC m=+34.360384421 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.808082 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.808074839 +0000 UTC m=+34.360401851 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.808106 7689 secret.go:189] Couldn't get secret openshift-ingress-operator/metrics-tls: secret "metrics-tls" not found Mar 07 21:14:44.808172 master-0 kubenswrapper[7689]: E0307 21:14:44.808182 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls podName:47ecf172-666e-4360-97ff-bd9dbccc1fd6 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.808155902 +0000 UTC m=+34.360482834 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls") pod "ingress-operator-677db989d6-tklw9" (UID: "47ecf172-666e-4360-97ff-bd9dbccc1fd6") : secret "metrics-tls" not found Mar 07 21:14:44.823271 master-0 kubenswrapper[7689]: I0307 21:14:44.823180 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" event={"ID":"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602","Type":"ContainerStarted","Data":"ee323378e5f254b4936ebddaed79c44e072c4abc42a4ea5e2f28f2991df5cf33"} Mar 07 21:14:45.839919 master-0 kubenswrapper[7689]: I0307 21:14:45.839715 7689 generic.go:334] "Generic (PLEG): container finished" podID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerID="7b14e0d42b70cc70f5e51131b552d35ae08e2304284eb28296a108002b51512b" exitCode=0 Mar 07 21:14:45.839919 master-0 kubenswrapper[7689]: I0307 21:14:45.839860 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" event={"ID":"29624e4f-d970-4dfa-a8f1-515b73397c8f","Type":"ContainerDied","Data":"7b14e0d42b70cc70f5e51131b552d35ae08e2304284eb28296a108002b51512b"} Mar 07 21:14:45.843859 master-0 kubenswrapper[7689]: I0307 21:14:45.843733 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" event={"ID":"5f82d4aa-0cb5-477f-944e-745a21d124fc","Type":"ContainerStarted","Data":"42f741a1d8745f4ba4855310764e131077825a56cb2981843ca7f7c641b06c4d"} Mar 07 21:14:46.028383 master-0 kubenswrapper[7689]: I0307 21:14:46.028270 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " 
pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:46.029578 master-0 kubenswrapper[7689]: E0307 21:14:46.028585 7689 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:46.029578 master-0 kubenswrapper[7689]: I0307 21:14:46.028651 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:46.029578 master-0 kubenswrapper[7689]: E0307 21:14:46.028743 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:50.028677853 +0000 UTC m=+23.581004785 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : secret "serving-cert" not found Mar 07 21:14:46.029578 master-0 kubenswrapper[7689]: E0307 21:14:46.028860 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:46.029578 master-0 kubenswrapper[7689]: E0307 21:14:46.029297 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:50.029265578 +0000 UTC m=+23.581592510 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : configmap "client-ca" not found Mar 07 21:14:46.738848 master-0 kubenswrapper[7689]: I0307 21:14:46.738740 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:46.739538 master-0 kubenswrapper[7689]: E0307 21:14:46.738959 7689 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:46.739538 master-0 kubenswrapper[7689]: I0307 21:14:46.738981 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:46.739538 master-0 kubenswrapper[7689]: E0307 21:14:46.739082 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:54.739052508 +0000 UTC m=+28.291379480 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : configmap "client-ca" not found Mar 07 21:14:46.739538 master-0 kubenswrapper[7689]: E0307 21:14:46.739136 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:46.739538 master-0 kubenswrapper[7689]: E0307 21:14:46.739232 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:14:54.739216432 +0000 UTC m=+28.291543424 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : secret "serving-cert" not found Mar 07 21:14:46.874647 master-0 kubenswrapper[7689]: I0307 21:14:46.873869 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" event={"ID":"24f69689-ff12-4786-af05-61429e9eadf8","Type":"ContainerStarted","Data":"c541936d2c1e33ad24f13bb7de438be39b6542e54689f0c9212561c0b1fef232"} Mar 07 21:14:47.888606 master-0 kubenswrapper[7689]: I0307 21:14:47.887969 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" event={"ID":"b88c5fbe-e19f-45b3-ab03-e1626f95776d","Type":"ContainerStarted","Data":"4dd4ab96de66a81d1a97cd72bb912ec500681a0000024a0cfaf545c2eaf36106"} Mar 07 21:14:48.895238 master-0 kubenswrapper[7689]: I0307 21:14:48.895137 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" event={"ID":"29624e4f-d970-4dfa-a8f1-515b73397c8f","Type":"ContainerStarted","Data":"7a9945baea4c13f880fbc215f8a1631a572c12331242f734424a747e14d18656"} Mar 07 21:14:48.896488 master-0 kubenswrapper[7689]: I0307 21:14:48.895434 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:49.285109 master-0 kubenswrapper[7689]: I0307 21:14:49.284971 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-h76wh"] Mar 07 21:14:49.285622 master-0 kubenswrapper[7689]: I0307 21:14:49.285592 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.287967 master-0 kubenswrapper[7689]: I0307 21:14:49.287917 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 21:14:49.288737 master-0 kubenswrapper[7689]: I0307 21:14:49.288662 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 21:14:49.288988 master-0 kubenswrapper[7689]: I0307 21:14:49.288925 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 21:14:49.289172 master-0 kubenswrapper[7689]: I0307 21:14:49.289146 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 21:14:49.296845 master-0 kubenswrapper[7689]: I0307 21:14:49.296797 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-h76wh"] Mar 07 21:14:49.373992 master-0 kubenswrapper[7689]: I0307 21:14:49.373866 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/2369ce94-237f-41ad-9875-173578764483-signing-cabundle\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.373992 master-0 kubenswrapper[7689]: I0307 21:14:49.373996 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2369ce94-237f-41ad-9875-173578764483-signing-key\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.374282 master-0 kubenswrapper[7689]: I0307 21:14:49.374087 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ds84\" (UniqueName: \"kubernetes.io/projected/2369ce94-237f-41ad-9875-173578764483-kube-api-access-4ds84\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.475913 master-0 kubenswrapper[7689]: I0307 21:14:49.475828 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ds84\" (UniqueName: \"kubernetes.io/projected/2369ce94-237f-41ad-9875-173578764483-kube-api-access-4ds84\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.476426 master-0 kubenswrapper[7689]: I0307 21:14:49.476383 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2369ce94-237f-41ad-9875-173578764483-signing-cabundle\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.476504 master-0 kubenswrapper[7689]: I0307 21:14:49.476470 
7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2369ce94-237f-41ad-9875-173578764483-signing-key\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.477757 master-0 kubenswrapper[7689]: I0307 21:14:49.477709 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2369ce94-237f-41ad-9875-173578764483-signing-cabundle\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.486389 master-0 kubenswrapper[7689]: I0307 21:14:49.486338 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2369ce94-237f-41ad-9875-173578764483-signing-key\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.584971 master-0 kubenswrapper[7689]: I0307 21:14:49.584774 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ds84\" (UniqueName: \"kubernetes.io/projected/2369ce94-237f-41ad-9875-173578764483-kube-api-access-4ds84\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.617942 master-0 kubenswrapper[7689]: I0307 21:14:49.617848 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:14:49.900888 master-0 kubenswrapper[7689]: I0307 21:14:49.900824 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-84bfdbbb7f-h76wh"] Mar 07 21:14:49.907794 master-0 kubenswrapper[7689]: W0307 21:14:49.907742 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2369ce94_237f_41ad_9875_173578764483.slice/crio-f80b9c0c4a67b1a1a1e71289650eb2c4b55996a4c860501c30cc10920d663d48 WatchSource:0}: Error finding container f80b9c0c4a67b1a1a1e71289650eb2c4b55996a4c860501c30cc10920d663d48: Status 404 returned error can't find the container with id f80b9c0c4a67b1a1a1e71289650eb2c4b55996a4c860501c30cc10920d663d48 Mar 07 21:14:50.096630 master-0 kubenswrapper[7689]: I0307 21:14:50.096128 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:50.096880 master-0 kubenswrapper[7689]: I0307 21:14:50.096695 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:14:50.096880 master-0 kubenswrapper[7689]: E0307 21:14:50.096383 7689 secret.go:189] Couldn't get secret openshift-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:50.096880 master-0 kubenswrapper[7689]: E0307 21:14:50.096823 7689 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:58.096788823 +0000 UTC m=+31.649115855 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : secret "serving-cert" not found Mar 07 21:14:50.096880 master-0 kubenswrapper[7689]: E0307 21:14:50.096876 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found Mar 07 21:14:50.097017 master-0 kubenswrapper[7689]: E0307 21:14:50.096945 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:14:58.096926167 +0000 UTC m=+31.649253049 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : configmap "client-ca" not found Mar 07 21:14:50.613597 master-0 kubenswrapper[7689]: I0307 21:14:50.613521 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:14:50.908117 master-0 kubenswrapper[7689]: I0307 21:14:50.907927 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" event={"ID":"2369ce94-237f-41ad-9875-173578764483","Type":"ContainerStarted","Data":"445492e4e6d40332995014dd6be660b4fadf0d896d317c849ff3f3a4ae8887c6"} Mar 07 21:14:50.908117 master-0 kubenswrapper[7689]: I0307 21:14:50.908050 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" event={"ID":"2369ce94-237f-41ad-9875-173578764483","Type":"ContainerStarted","Data":"f80b9c0c4a67b1a1a1e71289650eb2c4b55996a4c860501c30cc10920d663d48"} Mar 07 21:14:50.926828 master-0 kubenswrapper[7689]: I0307 21:14:50.926731 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" podStartSLOduration=1.926675043 podStartE2EDuration="1.926675043s" podCreationTimestamp="2026-03-07 21:14:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:50.925103402 +0000 UTC m=+24.477430334" watchObservedRunningTime="2026-03-07 21:14:50.926675043 +0000 UTC m=+24.479001945" Mar 07 21:14:54.369783 master-0 kubenswrapper[7689]: I0307 21:14:54.368483 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:54.369783 master-0 kubenswrapper[7689]: 
I0307 21:14:54.368774 7689 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 21:14:54.396831 master-0 kubenswrapper[7689]: I0307 21:14:54.396712 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:14:54.502046 master-0 kubenswrapper[7689]: I0307 21:14:54.501980 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"] Mar 07 21:14:54.502900 master-0 kubenswrapper[7689]: I0307 21:14:54.502877 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.504993 master-0 kubenswrapper[7689]: I0307 21:14:54.504946 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 07 21:14:54.505352 master-0 kubenswrapper[7689]: I0307 21:14:54.505321 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 07 21:14:54.507158 master-0 kubenswrapper[7689]: I0307 21:14:54.507121 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 07 21:14:54.515141 master-0 kubenswrapper[7689]: I0307 21:14:54.513782 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"] Mar 07 21:14:54.523324 master-0 kubenswrapper[7689]: I0307 21:14:54.521533 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 07 21:14:54.565708 master-0 kubenswrapper[7689]: I0307 21:14:54.562144 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjt7j\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-kube-api-access-zjt7j\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.565708 master-0 kubenswrapper[7689]: I0307 21:14:54.562206 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.565708 master-0 kubenswrapper[7689]: I0307 21:14:54.562294 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.565708 master-0 kubenswrapper[7689]: I0307 21:14:54.562326 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.565708 master-0 kubenswrapper[7689]: I0307 21:14:54.562377 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 
07 21:14:54.565708 master-0 kubenswrapper[7689]: I0307 21:14:54.562434 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.623242 master-0 kubenswrapper[7689]: I0307 21:14:54.623048 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"] Mar 07 21:14:54.623773 master-0 kubenswrapper[7689]: I0307 21:14:54.623754 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.626860 master-0 kubenswrapper[7689]: I0307 21:14:54.626817 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 07 21:14:54.627536 master-0 kubenswrapper[7689]: I0307 21:14:54.627487 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 07 21:14:54.634631 master-0 kubenswrapper[7689]: I0307 21:14:54.634584 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 07 21:14:54.639749 master-0 kubenswrapper[7689]: I0307 21:14:54.639696 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"] Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665216 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjt7j\" (UniqueName: 
\"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-kube-api-access-zjt7j\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665264 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665325 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jcxp\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-kube-api-access-2jcxp\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665357 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665387 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-catalogserver-certs\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665421 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665466 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665492 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665535 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/183a5212-1b21-44e4-9ed5-2f63f76e652e-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665554 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.665706 master-0 kubenswrapper[7689]: I0307 21:14:54.665573 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.668903 master-0 kubenswrapper[7689]: I0307 21:14:54.666864 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.668903 master-0 kubenswrapper[7689]: I0307 21:14:54.667020 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.669639 master-0 kubenswrapper[7689]: I0307 21:14:54.669228 7689 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.675048 master-0 kubenswrapper[7689]: I0307 21:14:54.675000 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.675825 master-0 kubenswrapper[7689]: I0307 21:14:54.675788 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.694415 master-0 kubenswrapper[7689]: I0307 21:14:54.694366 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjt7j\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-kube-api-access-zjt7j\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:14:54.766921 master-0 kubenswrapper[7689]: I0307 21:14:54.766829 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " 
pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:14:54.767146 master-0 kubenswrapper[7689]: E0307 21:14:54.766927 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:14:54.767146 master-0 kubenswrapper[7689]: E0307 21:14:54.767020 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:15:10.767001436 +0000 UTC m=+44.319328338 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : secret "serving-cert" not found Mar 07 21:14:54.767146 master-0 kubenswrapper[7689]: I0307 21:14:54.767128 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.767243 master-0 kubenswrapper[7689]: I0307 21:14:54.767220 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/183a5212-1b21-44e4-9ed5-2f63f76e652e-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:14:54.767356 master-0 kubenswrapper[7689]: I0307 21:14:54.767319 7689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.767408 master-0 kubenswrapper[7689]: I0307 21:14:54.767361 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.767491 master-0 kubenswrapper[7689]: I0307 21:14:54.767455 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.767664 master-0 kubenswrapper[7689]: I0307 21:14:54.767599 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jcxp\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-kube-api-access-2jcxp\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.767664 master-0 kubenswrapper[7689]: I0307 21:14:54.767618 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/183a5212-1b21-44e4-9ed5-2f63f76e652e-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.767664 master-0 kubenswrapper[7689]: I0307 21:14:54.767657 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.767788 master-0 kubenswrapper[7689]: I0307 21:14:54.767735 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") pod \"route-controller-manager-dbd867658-rkw4l\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") " pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:54.767845 master-0 kubenswrapper[7689]: E0307 21:14:54.767822 7689 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: configmap "client-ca" not found
Mar 07 21:14:54.767899 master-0 kubenswrapper[7689]: E0307 21:14:54.767880 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca podName:d1214cdf-12a8-41ad-a8b8-b11f34ce86bf nodeName:}" failed. No retries permitted until 2026-03-07 21:15:10.767860569 +0000 UTC m=+44.320187461 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca") pod "route-controller-manager-dbd867658-rkw4l" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf") : configmap "client-ca" not found
Mar 07 21:14:54.779376 master-0 kubenswrapper[7689]: I0307 21:14:54.779350 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.787244 master-0 kubenswrapper[7689]: I0307 21:14:54.787193 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jcxp\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-kube-api-access-2jcxp\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:54.839594 master-0 kubenswrapper[7689]: I0307 21:14:54.839517 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"
Mar 07 21:14:54.937470 master-0 kubenswrapper[7689]: I0307 21:14:54.936850 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:55.075890 master-0 kubenswrapper[7689]: I0307 21:14:55.075810 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"]
Mar 07 21:14:55.587487 master-0 kubenswrapper[7689]: I0307 21:14:55.587387 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 07 21:14:55.588866 master-0 kubenswrapper[7689]: I0307 21:14:55.588477 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.589037 master-0 kubenswrapper[7689]: I0307 21:14:55.588918 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"]
Mar 07 21:14:55.591621 master-0 kubenswrapper[7689]: I0307 21:14:55.591557 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 07 21:14:55.597068 master-0 kubenswrapper[7689]: W0307 21:14:55.597028 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod183a5212_1b21_44e4_9ed5_2f63f76e652e.slice/crio-5006ce201ad3cd74c89114726a54a453e54bfc124c5704a26b3fa400b0f6b877 WatchSource:0}: Error finding container 5006ce201ad3cd74c89114726a54a453e54bfc124c5704a26b3fa400b0f6b877: Status 404 returned error can't find the container with id 5006ce201ad3cd74c89114726a54a453e54bfc124c5704a26b3fa400b0f6b877
Mar 07 21:14:55.686406 master-0 kubenswrapper[7689]: I0307 21:14:55.686320 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.686672 master-0 kubenswrapper[7689]: I0307 21:14:55.686610 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-var-lock\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.686984 master-0 kubenswrapper[7689]: I0307 21:14:55.686924 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b95a709-faec-4d50-8742-935bddd84cbc-kube-api-access\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.788474 master-0 kubenswrapper[7689]: I0307 21:14:55.788186 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.788474 master-0 kubenswrapper[7689]: I0307 21:14:55.788409 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.789963 master-0 kubenswrapper[7689]: I0307 21:14:55.788620 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-var-lock\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.789963 master-0 kubenswrapper[7689]: I0307 21:14:55.788743 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-var-lock\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.789963 master-0 kubenswrapper[7689]: I0307 21:14:55.788867 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b95a709-faec-4d50-8742-935bddd84cbc-kube-api-access\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:55.929338 master-0 kubenswrapper[7689]: I0307 21:14:55.928828 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 07 21:14:55.947535 master-0 kubenswrapper[7689]: I0307 21:14:55.947224 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" event={"ID":"183a5212-1b21-44e4-9ed5-2f63f76e652e","Type":"ContainerStarted","Data":"5006ce201ad3cd74c89114726a54a453e54bfc124c5704a26b3fa400b0f6b877"}
Mar 07 21:14:55.949500 master-0 kubenswrapper[7689]: I0307 21:14:55.949424 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" event={"ID":"290f6cf4-daa1-4cae-8e91-2411bf81f8b4","Type":"ContainerStarted","Data":"41d17db6ef1aab779824d6d8f3fc1ed1e33240b810e2a0cc5546789ab0269c86"}
Mar 07 21:14:55.949635 master-0 kubenswrapper[7689]: I0307 21:14:55.949505 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" event={"ID":"290f6cf4-daa1-4cae-8e91-2411bf81f8b4","Type":"ContainerStarted","Data":"2a4e91956e6af4d37253ed844488126f5600b96517ef3a0ce7d67e4b637437bf"}
Mar 07 21:14:56.481445 master-0 kubenswrapper[7689]: I0307 21:14:56.481275 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b95a709-faec-4d50-8742-935bddd84cbc-kube-api-access\") pod \"installer-1-master-0\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:56.535070 master-0 kubenswrapper[7689]: I0307 21:14:56.535019 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0"
Mar 07 21:14:56.956783 master-0 kubenswrapper[7689]: I0307 21:14:56.956166 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" event={"ID":"183a5212-1b21-44e4-9ed5-2f63f76e652e","Type":"ContainerStarted","Data":"25ecc1fa40b21a7cab58bd93942c4c42ee8198373a91d03573b6a726f9d14b97"}
Mar 07 21:14:56.956783 master-0 kubenswrapper[7689]: I0307 21:14:56.956769 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:14:56.956783 master-0 kubenswrapper[7689]: I0307 21:14:56.956789 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" event={"ID":"183a5212-1b21-44e4-9ed5-2f63f76e652e","Type":"ContainerStarted","Data":"8f10d93d20499c3da298c974d4861d544ee9c5bce59d8a7447d7dff84ed9c7bb"}
Mar 07 21:14:56.959080 master-0 kubenswrapper[7689]: I0307 21:14:56.958992 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" event={"ID":"290f6cf4-daa1-4cae-8e91-2411bf81f8b4","Type":"ContainerStarted","Data":"229c457c12626388c83b801345b6a2d1fe3bbf16efee0d92665ab237bb56bee9"}
Mar 07 21:14:56.959268 master-0 kubenswrapper[7689]: I0307 21:14:56.959220 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"
Mar 07 21:14:57.326646 master-0 kubenswrapper[7689]: I0307 21:14:57.325618 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"]
Mar 07 21:14:57.964978 master-0 kubenswrapper[7689]: I0307 21:14:57.964912 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"2b95a709-faec-4d50-8742-935bddd84cbc","Type":"ContainerStarted","Data":"bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476"}
Mar 07 21:14:57.966466 master-0 kubenswrapper[7689]: I0307 21:14:57.966428 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"2b95a709-faec-4d50-8742-935bddd84cbc","Type":"ContainerStarted","Data":"fdbfce6137a81a2ff42bfd72e38c7518b24c4c85ede18658ee46b97ef6f69012"}
Mar 07 21:14:58.118966 master-0 kubenswrapper[7689]: I0307 21:14:58.118826 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" podStartSLOduration=4.118780195 podStartE2EDuration="4.118780195s" podCreationTimestamp="2026-03-07 21:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:58.101215717 +0000 UTC m=+31.653542659" watchObservedRunningTime="2026-03-07 21:14:58.118780195 +0000 UTC m=+31.671107137"
Mar 07 21:14:58.138076 master-0 kubenswrapper[7689]: I0307 21:14:58.137973 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"
Mar 07 21:14:58.138370 master-0 kubenswrapper[7689]: I0307 21:14:58.138116 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"
Mar 07 21:14:58.138370 master-0 kubenswrapper[7689]: E0307 21:14:58.138244 7689 configmap.go:193] Couldn't get configMap openshift-controller-manager/client-ca: configmap "client-ca" not found
Mar 07 21:14:58.138370 master-0 kubenswrapper[7689]: E0307 21:14:58.138312 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca podName:706c1d49-ef63-4383-8d46-c50f03aff6a3 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:14.138291753 +0000 UTC m=+47.690618655 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca") pod "controller-manager-57b874d6cb-w8kbv" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3") : configmap "client-ca" not found
Mar 07 21:14:58.147759 master-0 kubenswrapper[7689]: I0307 21:14:58.147721 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"controller-manager-57b874d6cb-w8kbv\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") " pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"
Mar 07 21:14:58.619407 master-0 kubenswrapper[7689]: I0307 21:14:58.618892 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" podStartSLOduration=4.618862578 podStartE2EDuration="4.618862578s" podCreationTimestamp="2026-03-07 21:14:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:58.617424041 +0000 UTC m=+32.169750973" watchObservedRunningTime="2026-03-07 21:14:58.618862578 +0000 UTC m=+32.171189510"
Mar 07 21:14:59.021819 master-0 kubenswrapper[7689]: I0307 21:14:59.020093 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-1-master-0" podStartSLOduration=4.020062135 podStartE2EDuration="4.020062135s" podCreationTimestamp="2026-03-07 21:14:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:14:59.011347907 +0000 UTC m=+32.563674799" watchObservedRunningTime="2026-03-07 21:14:59.020062135 +0000 UTC m=+32.572389027"
Mar 07 21:14:59.021819 master-0 kubenswrapper[7689]: I0307 21:14:59.021117 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"]
Mar 07 21:14:59.021819 master-0 kubenswrapper[7689]: E0307 21:14:59.021343 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" podUID="706c1d49-ef63-4383-8d46-c50f03aff6a3"
Mar 07 21:14:59.087446 master-0 kubenswrapper[7689]: I0307 21:14:59.087374 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"]
Mar 07 21:14:59.087768 master-0 kubenswrapper[7689]: E0307 21:14:59.087736 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[client-ca serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" podUID="d1214cdf-12a8-41ad-a8b8-b11f34ce86bf"
Mar 07 21:14:59.971956 master-0 kubenswrapper[7689]: I0307 21:14:59.971887 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-f877dfd9f-cnjsr"]
Mar 07 21:14:59.972932 master-0 kubenswrapper[7689]: I0307 21:14:59.972898 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:14:59.973211 master-0 kubenswrapper[7689]: I0307 21:14:59.973149 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:14:59.973477 master-0 kubenswrapper[7689]: I0307 21:14:59.973428 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"
Mar 07 21:14:59.975394 master-0 kubenswrapper[7689]: I0307 21:14:59.975346 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 07 21:14:59.977179 master-0 kubenswrapper[7689]: I0307 21:14:59.977119 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-0"
Mar 07 21:14:59.978557 master-0 kubenswrapper[7689]: I0307 21:14:59.978521 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert"
Mar 07 21:14:59.980079 master-0 kubenswrapper[7689]: I0307 21:14:59.980049 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 07 21:14:59.980153 master-0 kubenswrapper[7689]: I0307 21:14:59.980069 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 07 21:14:59.980153 master-0 kubenswrapper[7689]: I0307 21:14:59.980080 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 07 21:14:59.980278 master-0 kubenswrapper[7689]: I0307 21:14:59.980211 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 07 21:14:59.982062 master-0 kubenswrapper[7689]: I0307 21:14:59.982029 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-0"
Mar 07 21:14:59.984211 master-0 kubenswrapper[7689]: I0307 21:14:59.984188 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 07 21:14:59.996893 master-0 kubenswrapper[7689]: I0307 21:14:59.996794 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 07 21:15:00.011715 master-0 kubenswrapper[7689]: I0307 21:15:00.011235 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-f877dfd9f-cnjsr"]
Mar 07 21:15:00.023898 master-0 kubenswrapper[7689]: I0307 21:15:00.023831 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"
Mar 07 21:15:00.032397 master-0 kubenswrapper[7689]: I0307 21:15:00.032346 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"
Mar 07 21:15:00.069407 master-0 kubenswrapper[7689]: I0307 21:15:00.069325 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-image-import-ca\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069407 master-0 kubenswrapper[7689]: I0307 21:15:00.069405 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-etcd-serving-ca\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069779 master-0 kubenswrapper[7689]: I0307 21:15:00.069540 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-trusted-ca-bundle\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069779 master-0 kubenswrapper[7689]: I0307 21:15:00.069565 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069779 master-0 kubenswrapper[7689]: I0307 21:15:00.069643 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069779 master-0 kubenswrapper[7689]: I0307 21:15:00.069667 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069779 master-0 kubenswrapper[7689]: I0307 21:15:00.069753 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-config\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069779 master-0 kubenswrapper[7689]: I0307 21:15:00.069776 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f94jh\" (UniqueName: \"kubernetes.io/projected/07561664-5165-4c32-b34b-329a56a6a849-kube-api-access-f94jh\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069952 master-0 kubenswrapper[7689]: I0307 21:15:00.069815 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-encryption-config\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.069952 master-0 kubenswrapper[7689]: I0307 21:15:00.069848 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-node-pullsecrets\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.070010 master-0 kubenswrapper[7689]: I0307 21:15:00.069946 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-audit-dir\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.171643 master-0 kubenswrapper[7689]: I0307 21:15:00.171569 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-config\") pod \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") "
Mar 07 21:15:00.171946 master-0 kubenswrapper[7689]: I0307 21:15:00.171653 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qflsf\" (UniqueName: \"kubernetes.io/projected/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-kube-api-access-qflsf\") pod \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\" (UID: \"d1214cdf-12a8-41ad-a8b8-b11f34ce86bf\") "
Mar 07 21:15:00.171946 master-0 kubenswrapper[7689]: I0307 21:15:00.171792 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-proxy-ca-bundles\") pod \"706c1d49-ef63-4383-8d46-c50f03aff6a3\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") "
Mar 07 21:15:00.171946 master-0 kubenswrapper[7689]: I0307 21:15:00.171903 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-config\") pod \"706c1d49-ef63-4383-8d46-c50f03aff6a3\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") "
Mar 07 21:15:00.172063 master-0 kubenswrapper[7689]: I0307 21:15:00.171960 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") pod \"706c1d49-ef63-4383-8d46-c50f03aff6a3\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") "
Mar 07 21:15:00.172358 master-0 kubenswrapper[7689]: I0307 21:15:00.172238 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnxpn\" (UniqueName: \"kubernetes.io/projected/706c1d49-ef63-4383-8d46-c50f03aff6a3-kube-api-access-vnxpn\") pod \"706c1d49-ef63-4383-8d46-c50f03aff6a3\" (UID: \"706c1d49-ef63-4383-8d46-c50f03aff6a3\") "
Mar 07 21:15:00.172670 master-0 kubenswrapper[7689]: I0307 21:15:00.172627 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-trusted-ca-bundle\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.172864 master-0 kubenswrapper[7689]: I0307 21:15:00.172805 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "706c1d49-ef63-4383-8d46-c50f03aff6a3" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:00.174055 master-0 kubenswrapper[7689]: I0307 21:15:00.173924 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-config" (OuterVolumeSpecName: "config") pod "706c1d49-ef63-4383-8d46-c50f03aff6a3" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:00.174055 master-0 kubenswrapper[7689]: I0307 21:15:00.173994 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-config" (OuterVolumeSpecName: "config") pod "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:00.174497 master-0 kubenswrapper[7689]: I0307 21:15:00.174331 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-trusted-ca-bundle\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.174497 master-0 kubenswrapper[7689]: I0307 21:15:00.174413 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.174612 master-0 kubenswrapper[7689]: E0307 21:15:00.174537 7689 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 07 21:15:00.174612 master-0 kubenswrapper[7689]: E0307 21:15:00.174585 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.674569326 +0000 UTC m=+34.226896218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : secret "serving-cert" not found
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.174902 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.174942 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175011 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-config\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175029 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f94jh\" (UniqueName: \"kubernetes.io/projected/07561664-5165-4c32-b34b-329a56a6a849-kube-api-access-f94jh\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175086 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-encryption-config\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175122 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-node-pullsecrets\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175139 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-audit-dir\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175180 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-image-import-ca\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175206 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-etcd-serving-ca\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175288 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175300 7689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:00.175670 master-0 kubenswrapper[7689]: I0307 21:15:00.175312 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:00.176208 master-0 kubenswrapper[7689]: I0307 21:15:00.175748 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-node-pullsecrets\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.176208 master-0 kubenswrapper[7689]: E0307 21:15:00.175818 7689 secret.go:189] Couldn't get secret openshift-apiserver/etcd-client: secret "etcd-client" not found
Mar 07 21:15:00.176208 master-0 kubenswrapper[7689]: E0307 21:15:00.175922 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.67588531 +0000 UTC m=+34.228212402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etcd-client" (UniqueName: "kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : secret "etcd-client" not found
Mar 07 21:15:00.176208 master-0 kubenswrapper[7689]: I0307 21:15:00.176137 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-config\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.176208 master-0 kubenswrapper[7689]: I0307 21:15:00.176133 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-audit-dir\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:00.176422 master-0 kubenswrapper[7689]: E0307 21:15:00.176220 7689 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 07 21:15:00.176422 master-0 kubenswrapper[7689]: E0307 21:15:00.176287 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:00.676264389 +0000 UTC m=+34.228591281 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : configmap "audit-0" not found Mar 07 21:15:00.176422 master-0 kubenswrapper[7689]: I0307 21:15:00.176347 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-etcd-serving-ca\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.176545 master-0 kubenswrapper[7689]: I0307 21:15:00.176487 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "706c1d49-ef63-4383-8d46-c50f03aff6a3" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:15:00.176593 master-0 kubenswrapper[7689]: I0307 21:15:00.176540 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-image-import-ca\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.177369 master-0 kubenswrapper[7689]: I0307 21:15:00.177320 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-kube-api-access-qflsf" (OuterVolumeSpecName: "kube-api-access-qflsf") pod "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf" (UID: "d1214cdf-12a8-41ad-a8b8-b11f34ce86bf"). InnerVolumeSpecName "kube-api-access-qflsf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:15:00.177697 master-0 kubenswrapper[7689]: I0307 21:15:00.177638 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/706c1d49-ef63-4383-8d46-c50f03aff6a3-kube-api-access-vnxpn" (OuterVolumeSpecName: "kube-api-access-vnxpn") pod "706c1d49-ef63-4383-8d46-c50f03aff6a3" (UID: "706c1d49-ef63-4383-8d46-c50f03aff6a3"). InnerVolumeSpecName "kube-api-access-vnxpn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:15:00.180038 master-0 kubenswrapper[7689]: I0307 21:15:00.179801 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-encryption-config\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.195162 master-0 kubenswrapper[7689]: I0307 21:15:00.194212 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f94jh\" (UniqueName: \"kubernetes.io/projected/07561664-5165-4c32-b34b-329a56a6a849-kube-api-access-f94jh\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.276475 master-0 kubenswrapper[7689]: I0307 21:15:00.276310 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qflsf\" (UniqueName: \"kubernetes.io/projected/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-kube-api-access-qflsf\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:00.276475 master-0 kubenswrapper[7689]: I0307 21:15:00.276363 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/706c1d49-ef63-4383-8d46-c50f03aff6a3-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:00.276475 master-0 kubenswrapper[7689]: I0307 21:15:00.276382 7689 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnxpn\" (UniqueName: \"kubernetes.io/projected/706c1d49-ef63-4383-8d46-c50f03aff6a3-kube-api-access-vnxpn\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:00.688871 master-0 kubenswrapper[7689]: I0307 21:15:00.688767 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.688871 master-0 kubenswrapper[7689]: I0307 21:15:00.688876 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.689298 master-0 kubenswrapper[7689]: I0307 21:15:00.688930 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.689298 master-0 kubenswrapper[7689]: E0307 21:15:00.689087 7689 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 07 21:15:00.689298 master-0 kubenswrapper[7689]: E0307 21:15:00.689196 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:01.689162988 +0000 UTC m=+35.241489920 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : secret "serving-cert" not found Mar 07 21:15:00.689667 master-0 kubenswrapper[7689]: E0307 21:15:00.689294 7689 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found Mar 07 21:15:00.689667 master-0 kubenswrapper[7689]: E0307 21:15:00.689575 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:01.689486326 +0000 UTC m=+35.241813258 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : configmap "audit-0" not found Mar 07 21:15:00.694297 master-0 kubenswrapper[7689]: I0307 21:15:00.694232 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:00.892445 master-0 kubenswrapper[7689]: I0307 21:15:00.892310 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:15:00.892804 master-0 kubenswrapper[7689]: I0307 21:15:00.892461 7689 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:15:00.892804 master-0 kubenswrapper[7689]: I0307 21:15:00.892528 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:15:00.892804 master-0 kubenswrapper[7689]: I0307 21:15:00.892585 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:15:00.892804 master-0 kubenswrapper[7689]: E0307 21:15:00.892636 7689 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Mar 07 21:15:00.892804 master-0 kubenswrapper[7689]: E0307 21:15:00.892789 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs podName:dd310b71-6c79-4169-8b8a-7b3fe35a97fd nodeName:}" failed. No retries permitted until 2026-03-07 21:15:32.892751094 +0000 UTC m=+66.445078026 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs") pod "network-metrics-daemon-l2bdp" (UID: "dd310b71-6c79-4169-8b8a-7b3fe35a97fd") : secret "metrics-daemon-secret" not found Mar 07 21:15:00.893248 master-0 kubenswrapper[7689]: I0307 21:15:00.892879 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:15:00.893248 master-0 kubenswrapper[7689]: I0307 21:15:00.892942 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:15:00.893248 master-0 kubenswrapper[7689]: E0307 21:15:00.893140 7689 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Mar 07 21:15:00.893516 master-0 kubenswrapper[7689]: I0307 21:15:00.893230 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:15:00.893516 master-0 kubenswrapper[7689]: E0307 21:15:00.893309 7689 secret.go:189] Couldn't get secret 
openshift-operator-lifecycle-manager/catalog-operator-serving-cert: secret "catalog-operator-serving-cert" not found Mar 07 21:15:00.893516 master-0 kubenswrapper[7689]: E0307 21:15:00.893316 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls podName:a9d64cd1-bd5b-4fbc-972b-000a03c854fe nodeName:}" failed. No retries permitted until 2026-03-07 21:15:32.893263527 +0000 UTC m=+66.445590459 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-674cbfbd9d-czm5f" (UID: "a9d64cd1-bd5b-4fbc-972b-000a03c854fe") : secret "cluster-monitoring-operator-tls" not found Mar 07 21:15:00.893516 master-0 kubenswrapper[7689]: E0307 21:15:00.893444 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert podName:7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:32.893408161 +0000 UTC m=+66.445735093 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert") pod "catalog-operator-7d9c49f57b-j454x" (UID: "7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149") : secret "catalog-operator-serving-cert" not found Mar 07 21:15:00.893516 master-0 kubenswrapper[7689]: I0307 21:15:00.893498 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:15:00.893960 master-0 kubenswrapper[7689]: I0307 21:15:00.893589 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:15:00.893960 master-0 kubenswrapper[7689]: I0307 21:15:00.893643 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:15:00.894118 master-0 kubenswrapper[7689]: E0307 21:15:00.893983 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/package-server-manager-serving-cert: secret "package-server-manager-serving-cert" not found Mar 07 21:15:00.894220 master-0 kubenswrapper[7689]: 
I0307 21:15:00.894139 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:15:00.894320 master-0 kubenswrapper[7689]: E0307 21:15:00.894220 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert podName:e720291b-0f96-4ebb-80f2-5df7cb194ffc nodeName:}" failed. No retries permitted until 2026-03-07 21:15:32.894190761 +0000 UTC m=+66.446517703 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "package-server-manager-serving-cert" (UniqueName: "kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert") pod "package-server-manager-854648ff6d-kr9ft" (UID: "e720291b-0f96-4ebb-80f2-5df7cb194ffc") : secret "package-server-manager-serving-cert" not found Mar 07 21:15:00.894320 master-0 kubenswrapper[7689]: I0307 21:15:00.894273 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:15:00.894320 master-0 kubenswrapper[7689]: E0307 21:15:00.894311 7689 secret.go:189] Couldn't get secret openshift-multus/multus-admission-controller-secret: secret "multus-admission-controller-secret" not found Mar 07 21:15:00.894604 master-0 kubenswrapper[7689]: I0307 21:15:00.894357 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:15:00.894604 master-0 kubenswrapper[7689]: E0307 21:15:00.894390 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs podName:982319eb-2dc2-4faa-85d8-ee11840179fd nodeName:}" failed. No retries permitted until 2026-03-07 21:15:32.894363605 +0000 UTC m=+66.446690537 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs") pod "multus-admission-controller-8d675b596-mmqbs" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd") : secret "multus-admission-controller-secret" not found Mar 07 21:15:00.894604 master-0 kubenswrapper[7689]: E0307 21:15:00.894545 7689 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/olm-operator-serving-cert: secret "olm-operator-serving-cert" not found Mar 07 21:15:00.894945 master-0 kubenswrapper[7689]: E0307 21:15:00.894623 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert podName:69851821-e1fc-44a8-98df-0cfe9d564126 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:32.894600641 +0000 UTC m=+66.446927633 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "srv-cert" (UniqueName: "kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert") pod "olm-operator-d64cfc9db-qd6xh" (UID: "69851821-e1fc-44a8-98df-0cfe9d564126") : secret "olm-operator-serving-cert" not found Mar 07 21:15:00.894945 master-0 kubenswrapper[7689]: I0307 21:15:00.894509 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:15:00.894945 master-0 kubenswrapper[7689]: I0307 21:15:00.894771 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:15:00.895185 master-0 kubenswrapper[7689]: E0307 21:15:00.894972 7689 secret.go:189] Couldn't get secret openshift-marketplace/marketplace-operator-metrics: secret "marketplace-operator-metrics" not found Mar 07 21:15:00.895185 master-0 kubenswrapper[7689]: E0307 21:15:00.895051 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics podName:fc392945-53ad-473c-8803-70e2026712d2 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:32.895024153 +0000 UTC m=+66.447351105 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "marketplace-operator-metrics" (UniqueName: "kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics") pod "marketplace-operator-64bf9778cb-q7hrg" (UID: "fc392945-53ad-473c-8803-70e2026712d2") : secret "marketplace-operator-metrics" not found Mar 07 21:15:00.897938 master-0 kubenswrapper[7689]: I0307 21:15:00.897868 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:15:00.899671 master-0 kubenswrapper[7689]: I0307 21:15:00.899163 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:15:00.900548 master-0 kubenswrapper[7689]: I0307 21:15:00.900479 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:15:00.901723 master-0 kubenswrapper[7689]: I0307 21:15:00.901637 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod 
\"cluster-version-operator-745944c6b7-fjbl4\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:15:00.901867 master-0 kubenswrapper[7689]: I0307 21:15:00.901833 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:15:00.902196 master-0 kubenswrapper[7689]: I0307 21:15:00.902138 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:15:00.905785 master-0 kubenswrapper[7689]: I0307 21:15:00.905673 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:15:00.908013 master-0 kubenswrapper[7689]: I0307 21:15:00.907950 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:15:00.980866 master-0 kubenswrapper[7689]: I0307 21:15:00.980629 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l" Mar 07 21:15:00.981163 master-0 kubenswrapper[7689]: I0307 21:15:00.980836 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-57b874d6cb-w8kbv" Mar 07 21:15:01.031052 master-0 kubenswrapper[7689]: I0307 21:15:01.030956 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"] Mar 07 21:15:01.032270 master-0 kubenswrapper[7689]: I0307 21:15:01.031880 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" Mar 07 21:15:01.041496 master-0 kubenswrapper[7689]: I0307 21:15:01.041276 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 21:15:01.042604 master-0 kubenswrapper[7689]: I0307 21:15:01.042533 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 21:15:01.042899 master-0 kubenswrapper[7689]: I0307 21:15:01.042838 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 21:15:01.043122 master-0 kubenswrapper[7689]: I0307 21:15:01.043061 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 21:15:01.043327 master-0 kubenswrapper[7689]: I0307 21:15:01.043257 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"] Mar 07 21:15:01.047734 master-0 kubenswrapper[7689]: I0307 21:15:01.045257 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 21:15:01.047950 master-0 kubenswrapper[7689]: I0307 21:15:01.047788 7689 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-57b874d6cb-w8kbv"]
Mar 07 21:15:01.051811 master-0 kubenswrapper[7689]: I0307 21:15:01.051144 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"]
Mar 07 21:15:01.052198 master-0 kubenswrapper[7689]: I0307 21:15:01.052011 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 07 21:15:01.083400 master-0 kubenswrapper[7689]: I0307 21:15:01.083304 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"]
Mar 07 21:15:01.088155 master-0 kubenswrapper[7689]: I0307 21:15:01.088074 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-dbd867658-rkw4l"]
Mar 07 21:15:01.114784 master-0 kubenswrapper[7689]: I0307 21:15:01.103439 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:15:01.114784 master-0 kubenswrapper[7689]: I0307 21:15:01.104601 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:15:01.114784 master-0 kubenswrapper[7689]: I0307 21:15:01.104960 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:15:01.114784 master-0 kubenswrapper[7689]: I0307 21:15:01.105356 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:15:01.114784 master-0 kubenswrapper[7689]: I0307 21:15:01.106418 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"
Mar 07 21:15:01.116250 master-0 kubenswrapper[7689]: I0307 21:15:01.114819 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:15:01.170560 master-0 kubenswrapper[7689]: W0307 21:15:01.170479 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f4ab99a_1ea2_4bf4_a987_5b6edadedc6b.slice/crio-67e0f8c1f7a45675a4623829608878eb12540fff9458a8e44361ada4a21cc9a2 WatchSource:0}: Error finding container 67e0f8c1f7a45675a4623829608878eb12540fff9458a8e44361ada4a21cc9a2: Status 404 returned error can't find the container with id 67e0f8c1f7a45675a4623829608878eb12540fff9458a8e44361ada4a21cc9a2
Mar 07 21:15:01.199606 master-0 kubenswrapper[7689]: I0307 21:15:01.199480 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xvqqg\" (UniqueName: \"kubernetes.io/projected/2d15cce0-2fc4-44ad-afac-038a93e34ae9-kube-api-access-xvqqg\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.199903 master-0 kubenswrapper[7689]: I0307 21:15:01.199628 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-client-ca\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.199903 master-0 kubenswrapper[7689]: I0307 21:15:01.199661 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-config\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.199903 master-0 kubenswrapper[7689]: I0307 21:15:01.199706 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15cce0-2fc4-44ad-afac-038a93e34ae9-serving-cert\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.199903 master-0 kubenswrapper[7689]: I0307 21:15:01.199762 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-proxy-ca-bundles\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.199903 master-0 kubenswrapper[7689]: I0307 21:15:01.199839 7689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/706c1d49-ef63-4383-8d46-c50f03aff6a3-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:01.200364 master-0 kubenswrapper[7689]: I0307 21:15:01.199931 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:01.200364 master-0 kubenswrapper[7689]: I0307 21:15:01.199981 7689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:01.302310 master-0 kubenswrapper[7689]: I0307 21:15:01.301220 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xvqqg\" (UniqueName: \"kubernetes.io/projected/2d15cce0-2fc4-44ad-afac-038a93e34ae9-kube-api-access-xvqqg\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.302310 master-0 kubenswrapper[7689]: I0307 21:15:01.301572 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-client-ca\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.302310 master-0 kubenswrapper[7689]: I0307 21:15:01.301596 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-config\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.302310 master-0 kubenswrapper[7689]: I0307 21:15:01.301613 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15cce0-2fc4-44ad-afac-038a93e34ae9-serving-cert\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.302310 master-0 kubenswrapper[7689]: I0307 21:15:01.301630 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-proxy-ca-bundles\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.313117 master-0 kubenswrapper[7689]: I0307 21:15:01.310081 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15cce0-2fc4-44ad-afac-038a93e34ae9-serving-cert\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.336969 master-0 kubenswrapper[7689]: I0307 21:15:01.320640 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xvqqg\" (UniqueName: \"kubernetes.io/projected/2d15cce0-2fc4-44ad-afac-038a93e34ae9-kube-api-access-xvqqg\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.336969 master-0 kubenswrapper[7689]: I0307 21:15:01.322550 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-proxy-ca-bundles\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.336969 master-0 kubenswrapper[7689]: I0307 21:15:01.327486 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-config\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.352742 master-0 kubenswrapper[7689]: I0307 21:15:01.338874 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-client-ca\") pod \"controller-manager-f49f8b76c-p7dfh\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") " pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.383135 master-0 kubenswrapper[7689]: I0307 21:15:01.383037 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:01.463658 master-0 kubenswrapper[7689]: I0307 21:15:01.463478 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"]
Mar 07 21:15:01.475175 master-0 kubenswrapper[7689]: I0307 21:15:01.467336 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"]
Mar 07 21:15:01.592275 master-0 kubenswrapper[7689]: I0307 21:15:01.592211 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-589895fbb7-wqqqr"]
Mar 07 21:15:01.596658 master-0 kubenswrapper[7689]: I0307 21:15:01.596618 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"]
Mar 07 21:15:01.604865 master-0 kubenswrapper[7689]: I0307 21:15:01.604811 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"]
Mar 07 21:15:01.604865 master-0 kubenswrapper[7689]: I0307 21:15:01.604869 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-677db989d6-tklw9"]
Mar 07 21:15:01.609404 master-0 kubenswrapper[7689]: W0307 21:15:01.609369 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod61a9fce6_50e1_413c_9ec0_177d6e903bdd.slice/crio-54dbed5af2f7a016c39f2c1ec9963c58ffc5eb61e9822c478a7070f705204697 WatchSource:0}: Error finding container 54dbed5af2f7a016c39f2c1ec9963c58ffc5eb61e9822c478a7070f705204697: Status 404 returned error can't find the container with id 54dbed5af2f7a016c39f2c1ec9963c58ffc5eb61e9822c478a7070f705204697
Mar 07 21:15:01.612952 master-0 kubenswrapper[7689]: W0307 21:15:01.612896 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda61a736a_66e5_4ca1_a8a7_088cf73cfcce.slice/crio-449f1ddce65bd4d442100d7cd54f76263e409bc0a5c1725b8cdf399ad8c9c8ba WatchSource:0}: Error finding container 449f1ddce65bd4d442100d7cd54f76263e409bc0a5c1725b8cdf399ad8c9c8ba: Status 404 returned error can't find the container with id 449f1ddce65bd4d442100d7cd54f76263e409bc0a5c1725b8cdf399ad8c9c8ba
Mar 07 21:15:01.620249 master-0 kubenswrapper[7689]: W0307 21:15:01.620179 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d15cce0_2fc4_44ad_afac_038a93e34ae9.slice/crio-5b9dc197367ff3c95b79be8d47728e9c0e3edf2f81deec1d18569373f08b6bf4 WatchSource:0}: Error finding container 5b9dc197367ff3c95b79be8d47728e9c0e3edf2f81deec1d18569373f08b6bf4: Status 404 returned error can't find the container with id 5b9dc197367ff3c95b79be8d47728e9c0e3edf2f81deec1d18569373f08b6bf4
Mar 07 21:15:01.621954 master-0 kubenswrapper[7689]: W0307 21:15:01.621910 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod47ecf172_666e_4360_97ff_bd9dbccc1fd6.slice/crio-6a3ab252f2a6606dc25ac128723313dc899ecdc469d39e05f29cbcf092da5942 WatchSource:0}: Error finding container 6a3ab252f2a6606dc25ac128723313dc899ecdc469d39e05f29cbcf092da5942: Status 404 returned error can't find the container with id 6a3ab252f2a6606dc25ac128723313dc899ecdc469d39e05f29cbcf092da5942
Mar 07 21:15:01.707496 master-0 kubenswrapper[7689]: I0307 21:15:01.707418 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:01.707496 master-0 kubenswrapper[7689]: I0307 21:15:01.707478 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:01.707940 master-0 kubenswrapper[7689]: E0307 21:15:01.707583 7689 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 07 21:15:01.707940 master-0 kubenswrapper[7689]: E0307 21:15:01.707621 7689 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 07 21:15:01.707940 master-0 kubenswrapper[7689]: E0307 21:15:01.707651 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:03.707625932 +0000 UTC m=+37.259952824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : configmap "audit-0" not found
Mar 07 21:15:01.707940 master-0 kubenswrapper[7689]: E0307 21:15:01.707675 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:03.707659623 +0000 UTC m=+37.259986535 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : secret "serving-cert" not found
Mar 07 21:15:01.988722 master-0 kubenswrapper[7689]: I0307 21:15:01.988622 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" event={"ID":"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2","Type":"ContainerStarted","Data":"970d4806b55e4555ffff42e4b3c89ee95e0a6b585519742e791fd49bb6cf6a08"}
Mar 07 21:15:01.990039 master-0 kubenswrapper[7689]: I0307 21:15:01.989988 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" event={"ID":"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b","Type":"ContainerStarted","Data":"67e0f8c1f7a45675a4623829608878eb12540fff9458a8e44361ada4a21cc9a2"}
Mar 07 21:15:01.991844 master-0 kubenswrapper[7689]: I0307 21:15:01.991813 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" event={"ID":"a61a736a-66e5-4ca1-a8a7-088cf73cfcce","Type":"ContainerStarted","Data":"449f1ddce65bd4d442100d7cd54f76263e409bc0a5c1725b8cdf399ad8c9c8ba"}
Mar 07 21:15:01.993015 master-0 kubenswrapper[7689]: I0307 21:15:01.992955 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" event={"ID":"f8c93e0d-54e5-4c80-9d69-a70317baeacf","Type":"ContainerStarted","Data":"fe67bfc50554c3c039f940d887faf411984b747c8be2377d1eb15383b70de1a2"}
Mar 07 21:15:01.994409 master-0 kubenswrapper[7689]: I0307 21:15:01.994365 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" event={"ID":"61a9fce6-50e1-413c-9ec0-177d6e903bdd","Type":"ContainerStarted","Data":"54dbed5af2f7a016c39f2c1ec9963c58ffc5eb61e9822c478a7070f705204697"}
Mar 07 21:15:01.995907 master-0 kubenswrapper[7689]: I0307 21:15:01.995841 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" event={"ID":"47ecf172-666e-4360-97ff-bd9dbccc1fd6","Type":"ContainerStarted","Data":"6a3ab252f2a6606dc25ac128723313dc899ecdc469d39e05f29cbcf092da5942"}
Mar 07 21:15:01.997212 master-0 kubenswrapper[7689]: I0307 21:15:01.997164 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" event={"ID":"2d15cce0-2fc4-44ad-afac-038a93e34ae9","Type":"ContainerStarted","Data":"5b9dc197367ff3c95b79be8d47728e9c0e3edf2f81deec1d18569373f08b6bf4"}
Mar 07 21:15:02.692234 master-0 kubenswrapper[7689]: I0307 21:15:02.692173 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="706c1d49-ef63-4383-8d46-c50f03aff6a3" path="/var/lib/kubelet/pods/706c1d49-ef63-4383-8d46-c50f03aff6a3/volumes"
Mar 07 21:15:02.693038 master-0 kubenswrapper[7689]: I0307 21:15:02.692523 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1214cdf-12a8-41ad-a8b8-b11f34ce86bf" path="/var/lib/kubelet/pods/d1214cdf-12a8-41ad-a8b8-b11f34ce86bf/volumes"
Mar 07 21:15:03.313074 master-0 kubenswrapper[7689]: I0307 21:15:03.312999 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"]
Mar 07 21:15:03.314159 master-0 kubenswrapper[7689]: I0307 21:15:03.313907 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.316271 master-0 kubenswrapper[7689]: I0307 21:15:03.316220 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 07 21:15:03.317100 master-0 kubenswrapper[7689]: I0307 21:15:03.317047 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 07 21:15:03.317414 master-0 kubenswrapper[7689]: I0307 21:15:03.317374 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 07 21:15:03.317662 master-0 kubenswrapper[7689]: I0307 21:15:03.317623 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 07 21:15:03.318042 master-0 kubenswrapper[7689]: I0307 21:15:03.318003 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 07 21:15:03.324307 master-0 kubenswrapper[7689]: I0307 21:15:03.324273 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"]
Mar 07 21:15:03.450088 master-0 kubenswrapper[7689]: I0307 21:15:03.450033 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pjqr\" (UniqueName: \"kubernetes.io/projected/7e890559-2ff3-40aa-96ef-eeb997030eb6-kube-api-access-9pjqr\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.450329 master-0 kubenswrapper[7689]: I0307 21:15:03.450193 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-client-ca\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.450602 master-0 kubenswrapper[7689]: I0307 21:15:03.450430 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-config\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.451066 master-0 kubenswrapper[7689]: I0307 21:15:03.451031 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.552408 master-0 kubenswrapper[7689]: I0307 21:15:03.552341 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.552408 master-0 kubenswrapper[7689]: I0307 21:15:03.552410 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9pjqr\" (UniqueName: \"kubernetes.io/projected/7e890559-2ff3-40aa-96ef-eeb997030eb6-kube-api-access-9pjqr\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.552654 master-0 kubenswrapper[7689]: I0307 21:15:03.552492 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-client-ca\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.552654 master-0 kubenswrapper[7689]: I0307 21:15:03.552558 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-config\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.553557 master-0 kubenswrapper[7689]: I0307 21:15:03.553530 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-config\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.553664 master-0 kubenswrapper[7689]: E0307 21:15:03.553639 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 07 21:15:03.553735 master-0 kubenswrapper[7689]: E0307 21:15:03.553710 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert podName:7e890559-2ff3-40aa-96ef-eeb997030eb6 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:04.053695608 +0000 UTC m=+37.606022500 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert") pod "route-controller-manager-7c8cdf56b5-h464s" (UID: "7e890559-2ff3-40aa-96ef-eeb997030eb6") : secret "serving-cert" not found
Mar 07 21:15:03.554549 master-0 kubenswrapper[7689]: I0307 21:15:03.554521 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-client-ca\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.590715 master-0 kubenswrapper[7689]: I0307 21:15:03.590456 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pjqr\" (UniqueName: \"kubernetes.io/projected/7e890559-2ff3-40aa-96ef-eeb997030eb6-kube-api-access-9pjqr\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:03.755577 master-0 kubenswrapper[7689]: I0307 21:15:03.754881 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:03.755577 master-0 kubenswrapper[7689]: I0307 21:15:03.754935 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit\") pod \"apiserver-f877dfd9f-cnjsr\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") " pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:03.755577 master-0 kubenswrapper[7689]: E0307 21:15:03.755090 7689 configmap.go:193] Couldn't get configMap openshift-apiserver/audit-0: configmap "audit-0" not found
Mar 07 21:15:03.755577 master-0 kubenswrapper[7689]: E0307 21:15:03.755147 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:07.755129647 +0000 UTC m=+41.307456529 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "audit" (UniqueName: "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : configmap "audit-0" not found
Mar 07 21:15:03.755577 master-0 kubenswrapper[7689]: E0307 21:15:03.755524 7689 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found
Mar 07 21:15:03.755577 master-0 kubenswrapper[7689]: E0307 21:15:03.755549 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert podName:07561664-5165-4c32-b34b-329a56a6a849 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:07.755540248 +0000 UTC m=+41.307867140 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert") pod "apiserver-f877dfd9f-cnjsr" (UID: "07561664-5165-4c32-b34b-329a56a6a849") : secret "serving-cert" not found
Mar 07 21:15:03.774210 master-0 kubenswrapper[7689]: I0307 21:15:03.774133 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-f877dfd9f-cnjsr"]
Mar 07 21:15:03.774597 master-0 kubenswrapper[7689]: E0307 21:15:03.774552 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[audit serving-cert], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" podUID="07561664-5165-4c32-b34b-329a56a6a849"
Mar 07 21:15:04.008206 master-0 kubenswrapper[7689]: I0307 21:15:04.007493 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:04.021371 master-0 kubenswrapper[7689]: I0307 21:15:04.021322 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr"
Mar 07 21:15:04.059665 master-0 kubenswrapper[7689]: I0307 21:15:04.059552 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:04.059988 master-0 kubenswrapper[7689]: E0307 21:15:04.059963 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found
Mar 07 21:15:04.060080 master-0 kubenswrapper[7689]: E0307 21:15:04.060037 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert podName:7e890559-2ff3-40aa-96ef-eeb997030eb6 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:05.060018034 +0000 UTC m=+38.612344926 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert") pod "route-controller-manager-7c8cdf56b5-h464s" (UID: "7e890559-2ff3-40aa-96ef-eeb997030eb6") : secret "serving-cert" not found
Mar 07 21:15:04.163282 master-0 kubenswrapper[7689]: I0307 21:15:04.163209 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163282 master-0 kubenswrapper[7689]: I0307 21:15:04.163279 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f94jh\" (UniqueName: \"kubernetes.io/projected/07561664-5165-4c32-b34b-329a56a6a849-kube-api-access-f94jh\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163623 master-0 kubenswrapper[7689]: I0307 21:15:04.163319 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-etcd-serving-ca\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163623 master-0 kubenswrapper[7689]: I0307 21:15:04.163348 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-audit-dir\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163623 master-0 kubenswrapper[7689]: I0307 21:15:04.163422 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-image-import-ca\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163623 master-0 kubenswrapper[7689]: I0307 21:15:04.163464 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-encryption-config\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163623 master-0 kubenswrapper[7689]: I0307 21:15:04.163498 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-config\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163623 master-0 kubenswrapper[7689]: I0307 21:15:04.163524 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-node-pullsecrets\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163623 master-0 kubenswrapper[7689]: I0307 21:15:04.163601 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-trusted-ca-bundle\") pod \"07561664-5165-4c32-b34b-329a56a6a849\" (UID: \"07561664-5165-4c32-b34b-329a56a6a849\") "
Mar 07 21:15:04.163947 master-0 kubenswrapper[7689]: I0307 21:15:04.163623 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:15:04.164757 master-0 kubenswrapper[7689]: I0307 21:15:04.164171 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-node-pullsecrets" (OuterVolumeSpecName: "node-pullsecrets") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "node-pullsecrets". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:15:04.164757 master-0 kubenswrapper[7689]: I0307 21:15:04.164265 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:04.164757 master-0 kubenswrapper[7689]: I0307 21:15:04.164364 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:04.164757 master-0 kubenswrapper[7689]: I0307 21:15:04.164376 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-config" (OuterVolumeSpecName: "config") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:04.164757 master-0 kubenswrapper[7689]: I0307 21:15:04.164553 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:04.165418 master-0 kubenswrapper[7689]: I0307 21:15:04.165272 7689 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:04.165418 master-0 kubenswrapper[7689]: I0307 21:15:04.165312 7689 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-etcd-serving-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:04.165418 master-0 kubenswrapper[7689]: I0307 21:15:04.165330 7689 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:04.165418 master-0 kubenswrapper[7689]: I0307 21:15:04.165349 7689 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-image-import-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:04.165418 master-0 kubenswrapper[7689]: I0307 21:15:04.165368 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:04.165418 master-0 kubenswrapper[7689]: I0307 21:15:04.165388 7689 reconciler_common.go:293] "Volume detached for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/07561664-5165-4c32-b34b-329a56a6a849-node-pullsecrets\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:04.168709 master-0 kubenswrapper[7689]: I0307 21:15:04.168260 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/07561664-5165-4c32-b34b-329a56a6a849-kube-api-access-f94jh" (OuterVolumeSpecName: "kube-api-access-f94jh") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "kube-api-access-f94jh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:15:04.168709 master-0 kubenswrapper[7689]: I0307 21:15:04.168319 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:15:04.171085 master-0 kubenswrapper[7689]: I0307 21:15:04.171041 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "07561664-5165-4c32-b34b-329a56a6a849" (UID: "07561664-5165-4c32-b34b-329a56a6a849"). InnerVolumeSpecName "encryption-config".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:15:04.269467 master-0 kubenswrapper[7689]: I0307 21:15:04.269214 7689 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-encryption-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:04.269467 master-0 kubenswrapper[7689]: I0307 21:15:04.269286 7689 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-etcd-client\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:04.269467 master-0 kubenswrapper[7689]: I0307 21:15:04.269306 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f94jh\" (UniqueName: \"kubernetes.io/projected/07561664-5165-4c32-b34b-329a56a6a849-kube-api-access-f94jh\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:04.848860 master-0 kubenswrapper[7689]: I0307 21:15:04.848765 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:15:04.942121 master-0 kubenswrapper[7689]: I0307 21:15:04.942049 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:15:05.011584 master-0 kubenswrapper[7689]: I0307 21:15:05.011369 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-f877dfd9f-cnjsr" Mar 07 21:15:05.052789 master-0 kubenswrapper[7689]: I0307 21:15:05.052666 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-694d775589-btnh4"] Mar 07 21:15:05.054416 master-0 kubenswrapper[7689]: I0307 21:15:05.054363 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.056727 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.057219 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.057429 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-apiserver/apiserver-f877dfd9f-cnjsr"] Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.057502 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.058330 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.058640 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.058806 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.059808 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.059912 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 07 21:15:05.063555 master-0 kubenswrapper[7689]: I0307 21:15:05.060351 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 07 21:15:05.064574 master-0 kubenswrapper[7689]: I0307 21:15:05.064350 7689 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-apiserver/apiserver-f877dfd9f-cnjsr"] Mar 07 21:15:05.070853 master-0 kubenswrapper[7689]: I0307 21:15:05.070792 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-694d775589-btnh4"] Mar 07 21:15:05.080172 master-0 kubenswrapper[7689]: I0307 21:15:05.080106 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 21:15:05.081385 master-0 kubenswrapper[7689]: I0307 21:15:05.080966 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" Mar 07 21:15:05.081385 master-0 kubenswrapper[7689]: E0307 21:15:05.081199 7689 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: secret "serving-cert" not found Mar 07 21:15:05.081385 master-0 kubenswrapper[7689]: E0307 21:15:05.081266 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert podName:7e890559-2ff3-40aa-96ef-eeb997030eb6 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:07.081244821 +0000 UTC m=+40.633571723 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert") pod "route-controller-manager-7c8cdf56b5-h464s" (UID: "7e890559-2ff3-40aa-96ef-eeb997030eb6") : secret "serving-cert" not found Mar 07 21:15:05.186031 master-0 kubenswrapper[7689]: I0307 21:15:05.185936 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-image-import-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186031 master-0 kubenswrapper[7689]: I0307 21:15:05.186015 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-trusted-ca-bundle\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186423 master-0 kubenswrapper[7689]: I0307 21:15:05.186085 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186423 master-0 kubenswrapper[7689]: I0307 21:15:05.186111 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186423 master-0 
kubenswrapper[7689]: I0307 21:15:05.186134 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186423 master-0 kubenswrapper[7689]: I0307 21:15:05.186325 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-client\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186423 master-0 kubenswrapper[7689]: I0307 21:15:05.186395 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit-dir\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186592 master-0 kubenswrapper[7689]: I0307 21:15:05.186472 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-node-pullsecrets\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186592 master-0 kubenswrapper[7689]: I0307 21:15:05.186523 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69jxd\" (UniqueName: \"kubernetes.io/projected/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-kube-api-access-69jxd\") pod \"apiserver-694d775589-btnh4\" (UID: 
\"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186592 master-0 kubenswrapper[7689]: I0307 21:15:05.186557 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-encryption-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186592 master-0 kubenswrapper[7689]: I0307 21:15:05.186576 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-serving-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.186742 master-0 kubenswrapper[7689]: I0307 21:15:05.186697 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/07561664-5165-4c32-b34b-329a56a6a849-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:05.186742 master-0 kubenswrapper[7689]: I0307 21:15:05.186713 7689 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/07561664-5165-4c32-b34b-329a56a6a849-audit\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:05.288141 master-0 kubenswrapper[7689]: I0307 21:15:05.287911 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69jxd\" (UniqueName: \"kubernetes.io/projected/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-kube-api-access-69jxd\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288141 master-0 kubenswrapper[7689]: I0307 21:15:05.288079 7689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-serving-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288318 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-encryption-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288406 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-image-import-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288434 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-trusted-ca-bundle\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288507 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 
kubenswrapper[7689]: I0307 21:15:05.288528 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288546 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288607 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-client\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288630 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit-dir\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.288753 master-0 kubenswrapper[7689]: I0307 21:15:05.288698 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-node-pullsecrets\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.291752 master-0 
kubenswrapper[7689]: I0307 21:15:05.288909 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-node-pullsecrets\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.291752 master-0 kubenswrapper[7689]: I0307 21:15:05.289654 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit-dir\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.291752 master-0 kubenswrapper[7689]: I0307 21:15:05.290315 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.291752 master-0 kubenswrapper[7689]: I0307 21:15:05.290606 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-image-import-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.291752 master-0 kubenswrapper[7689]: E0307 21:15:05.290668 7689 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 07 21:15:05.291752 master-0 kubenswrapper[7689]: E0307 21:15:05.290879 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert podName:e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9 nodeName:}" failed. 
No retries permitted until 2026-03-07 21:15:05.790833983 +0000 UTC m=+39.343160915 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert") pod "apiserver-694d775589-btnh4" (UID: "e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9") : secret "serving-cert" not found Mar 07 21:15:05.291752 master-0 kubenswrapper[7689]: I0307 21:15:05.291423 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-trusted-ca-bundle\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.291752 master-0 kubenswrapper[7689]: I0307 21:15:05.291455 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.292447 master-0 kubenswrapper[7689]: I0307 21:15:05.291842 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-serving-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.295429 master-0 kubenswrapper[7689]: I0307 21:15:05.295328 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-client\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.297015 master-0 kubenswrapper[7689]: 
I0307 21:15:05.296970 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-encryption-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.319034 master-0 kubenswrapper[7689]: I0307 21:15:05.318660 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69jxd\" (UniqueName: \"kubernetes.io/projected/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-kube-api-access-69jxd\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.796047 master-0 kubenswrapper[7689]: I0307 21:15:05.795966 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:05.796457 master-0 kubenswrapper[7689]: E0307 21:15:05.796377 7689 secret.go:189] Couldn't get secret openshift-apiserver/serving-cert: secret "serving-cert" not found Mar 07 21:15:05.796619 master-0 kubenswrapper[7689]: E0307 21:15:05.796583 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert podName:e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:06.796538954 +0000 UTC m=+40.348865996 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert") pod "apiserver-694d775589-btnh4" (UID: "e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9") : secret "serving-cert" not found Mar 07 21:15:06.424726 master-0 kubenswrapper[7689]: I0307 21:15:06.413856 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 07 21:15:06.424726 master-0 kubenswrapper[7689]: I0307 21:15:06.415270 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.424726 master-0 kubenswrapper[7689]: I0307 21:15:06.417490 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 07 21:15:06.435186 master-0 kubenswrapper[7689]: I0307 21:15:06.435098 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 07 21:15:06.537703 master-0 kubenswrapper[7689]: I0307 21:15:06.537621 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 07 21:15:06.538593 master-0 kubenswrapper[7689]: I0307 21:15:06.538536 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-1-master-0" podUID="2b95a709-faec-4d50-8742-935bddd84cbc" containerName="installer" containerID="cri-o://bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476" gracePeriod=30 Mar 07 21:15:06.612874 master-0 kubenswrapper[7689]: I0307 21:15:06.612767 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.613231 master-0 kubenswrapper[7689]: I0307 
21:15:06.613018 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e757a93e-91aa-4fce-949b-4c51a060528e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.613434 master-0 kubenswrapper[7689]: I0307 21:15:06.613397 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-var-lock\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.690751 master-0 kubenswrapper[7689]: I0307 21:15:06.690497 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="07561664-5165-4c32-b34b-329a56a6a849" path="/var/lib/kubelet/pods/07561664-5165-4c32-b34b-329a56a6a849/volumes" Mar 07 21:15:06.715106 master-0 kubenswrapper[7689]: I0307 21:15:06.715055 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-var-lock\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.715328 master-0 kubenswrapper[7689]: I0307 21:15:06.715156 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.715328 master-0 kubenswrapper[7689]: I0307 21:15:06.715247 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-var-lock\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.715328 master-0 kubenswrapper[7689]: I0307 21:15:06.715281 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e757a93e-91aa-4fce-949b-4c51a060528e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.715565 master-0 kubenswrapper[7689]: I0307 21:15:06.715515 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.738576 master-0 kubenswrapper[7689]: I0307 21:15:06.738508 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e757a93e-91aa-4fce-949b-4c51a060528e-kube-api-access\") pod \"installer-1-master-0\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.805241 master-0 kubenswrapper[7689]: I0307 21:15:06.805175 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:06.816505 master-0 kubenswrapper[7689]: I0307 21:15:06.816471 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:06.819874 master-0 kubenswrapper[7689]: I0307 21:15:06.819822 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:06.917644 master-0 kubenswrapper[7689]: I0307 21:15:06.917541 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:07.120459 master-0 kubenswrapper[7689]: I0307 21:15:07.120311 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" Mar 07 21:15:07.132643 master-0 kubenswrapper[7689]: I0307 21:15:07.132581 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") pod \"route-controller-manager-7c8cdf56b5-h464s\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") " pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" Mar 07 21:15:07.239653 master-0 kubenswrapper[7689]: 
I0307 21:15:07.239580 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" Mar 07 21:15:08.940387 master-0 kubenswrapper[7689]: I0307 21:15:08.939993 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 07 21:15:08.941252 master-0 kubenswrapper[7689]: I0307 21:15:08.941029 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:08.950940 master-0 kubenswrapper[7689]: I0307 21:15:08.950847 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 07 21:15:08.960858 master-0 kubenswrapper[7689]: I0307 21:15:08.960799 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-var-lock\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:08.961026 master-0 kubenswrapper[7689]: I0307 21:15:08.960964 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16204ee5-568e-453c-90c0-8966cb2b9d88-kube-api-access\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:08.961026 master-0 kubenswrapper[7689]: I0307 21:15:08.961002 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 
21:15:09.062078 master-0 kubenswrapper[7689]: I0307 21:15:09.061991 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-var-lock\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:09.062363 master-0 kubenswrapper[7689]: I0307 21:15:09.062127 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-var-lock\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:09.062363 master-0 kubenswrapper[7689]: I0307 21:15:09.062171 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16204ee5-568e-453c-90c0-8966cb2b9d88-kube-api-access\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:09.062363 master-0 kubenswrapper[7689]: I0307 21:15:09.062194 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:09.062526 master-0 kubenswrapper[7689]: I0307 21:15:09.062372 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:09.081222 master-0 kubenswrapper[7689]: 
I0307 21:15:09.081161 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16204ee5-568e-453c-90c0-8966cb2b9d88-kube-api-access\") pod \"installer-2-master-0\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") " pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:09.290557 master-0 kubenswrapper[7689]: I0307 21:15:09.290384 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0" Mar 07 21:15:09.596249 master-0 kubenswrapper[7689]: I0307 21:15:09.594717 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"] Mar 07 21:15:09.596249 master-0 kubenswrapper[7689]: I0307 21:15:09.595900 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.599786 master-0 kubenswrapper[7689]: I0307 21:15:09.599670 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 07 21:15:09.601188 master-0 kubenswrapper[7689]: I0307 21:15:09.600128 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 21:15:09.601188 master-0 kubenswrapper[7689]: I0307 21:15:09.600363 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 07 21:15:09.601188 master-0 kubenswrapper[7689]: I0307 21:15:09.600624 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 07 21:15:09.602248 master-0 kubenswrapper[7689]: I0307 21:15:09.602228 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 07 21:15:09.602489 master-0 kubenswrapper[7689]: I0307 21:15:09.602470 7689 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 21:15:09.602640 master-0 kubenswrapper[7689]: I0307 21:15:09.602624 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 07 21:15:09.603428 master-0 kubenswrapper[7689]: I0307 21:15:09.603408 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 07 21:15:09.615587 master-0 kubenswrapper[7689]: I0307 21:15:09.615304 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"] Mar 07 21:15:09.673323 master-0 kubenswrapper[7689]: I0307 21:15:09.673180 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-encryption-config\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.673323 master-0 kubenswrapper[7689]: I0307 21:15:09.673236 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-client\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.673556 master-0 kubenswrapper[7689]: I0307 21:15:09.673339 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-policies\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.673556 master-0 kubenswrapper[7689]: 
I0307 21:15:09.673445 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-serving-cert\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.673556 master-0 kubenswrapper[7689]: I0307 21:15:09.673519 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-serving-ca\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.673641 master-0 kubenswrapper[7689]: I0307 21:15:09.673560 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-dir\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.673641 master-0 kubenswrapper[7689]: I0307 21:15:09.673590 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lbmm\" (UniqueName: \"kubernetes.io/projected/7d462ed3-d191-42a5-b8e0-79ab9af13991-kube-api-access-4lbmm\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.673641 master-0 kubenswrapper[7689]: I0307 21:15:09.673621 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-trusted-ca-bundle\") pod 
\"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.774543 master-0 kubenswrapper[7689]: I0307 21:15:09.774457 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-encryption-config\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.774543 master-0 kubenswrapper[7689]: I0307 21:15:09.774520 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-client\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.775178 master-0 kubenswrapper[7689]: I0307 21:15:09.774629 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-policies\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.775178 master-0 kubenswrapper[7689]: I0307 21:15:09.774743 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-serving-cert\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.775178 master-0 kubenswrapper[7689]: I0307 21:15:09.774785 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-serving-ca\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.775178 master-0 kubenswrapper[7689]: I0307 21:15:09.774820 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-dir\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.775178 master-0 kubenswrapper[7689]: I0307 21:15:09.774850 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbmm\" (UniqueName: \"kubernetes.io/projected/7d462ed3-d191-42a5-b8e0-79ab9af13991-kube-api-access-4lbmm\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.775178 master-0 kubenswrapper[7689]: I0307 21:15:09.774887 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-trusted-ca-bundle\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.776025 master-0 kubenswrapper[7689]: I0307 21:15:09.775864 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-trusted-ca-bundle\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.776025 master-0 kubenswrapper[7689]: I0307 21:15:09.775920 7689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-dir\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.776320 master-0 kubenswrapper[7689]: I0307 21:15:09.776271 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-policies\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.777385 master-0 kubenswrapper[7689]: I0307 21:15:09.776753 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-serving-ca\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.779475 master-0 kubenswrapper[7689]: I0307 21:15:09.779306 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-client\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.781328 master-0 kubenswrapper[7689]: I0307 21:15:09.781280 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-serving-cert\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.791555 master-0 kubenswrapper[7689]: I0307 21:15:09.791502 7689 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-encryption-config\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.793802 master-0 kubenswrapper[7689]: I0307 21:15:09.793755 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbmm\" (UniqueName: \"kubernetes.io/projected/7d462ed3-d191-42a5-b8e0-79ab9af13991-kube-api-access-4lbmm\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:09.942202 master-0 kubenswrapper[7689]: I0307 21:15:09.942035 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:11.005317 master-0 kubenswrapper[7689]: I0307 21:15:10.998503 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"] Mar 07 21:15:11.023964 master-0 kubenswrapper[7689]: I0307 21:15:11.020714 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-694d775589-btnh4"] Mar 07 21:15:11.060813 master-0 kubenswrapper[7689]: I0307 21:15:11.058930 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"] Mar 07 21:15:11.066886 master-0 kubenswrapper[7689]: W0307 21:15:11.066319 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode8f9c2bb_0b0e_48e0_8728_f2a460dc69e9.slice/crio-7ac62432ddbefa836db6b7adb92368df7a58058d250dbdf00dc899851d8a07e1 WatchSource:0}: Error finding container 7ac62432ddbefa836db6b7adb92368df7a58058d250dbdf00dc899851d8a07e1: Status 404 returned error 
can't find the container with id 7ac62432ddbefa836db6b7adb92368df7a58058d250dbdf00dc899851d8a07e1 Mar 07 21:15:11.071563 master-0 kubenswrapper[7689]: I0307 21:15:11.070608 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"] Mar 07 21:15:11.076272 master-0 kubenswrapper[7689]: I0307 21:15:11.072801 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:15:11.106034 master-0 kubenswrapper[7689]: I0307 21:15:11.087516 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-1-master-0"] Mar 07 21:15:11.106034 master-0 kubenswrapper[7689]: I0307 21:15:11.101045 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" event={"ID":"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2","Type":"ContainerStarted","Data":"3d9119a026d90b8ec2d78d2795489aa4c35f51d54ccaf8a6982c9cbfecf34cd0"} Mar 07 21:15:11.106034 master-0 kubenswrapper[7689]: I0307 21:15:11.104600 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" event={"ID":"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b","Type":"ContainerStarted","Data":"f11dea03780316a0cd94d2e932a489c49a45b9ec1636336c36582f2f1729ff4b"} Mar 07 21:15:11.132544 master-0 kubenswrapper[7689]: I0307 21:15:11.131018 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-694d775589-btnh4" event={"ID":"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9","Type":"ContainerStarted","Data":"7ac62432ddbefa836db6b7adb92368df7a58058d250dbdf00dc899851d8a07e1"} Mar 07 21:15:11.151055 master-0 kubenswrapper[7689]: I0307 21:15:11.150961 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" 
event={"ID":"a61a736a-66e5-4ca1-a8a7-088cf73cfcce","Type":"ContainerStarted","Data":"8ba80cee3d89d3d6b976aac0e2f007c4e112b08741be4d9e1220847381797dab"} Mar 07 21:15:11.195621 master-0 kubenswrapper[7689]: I0307 21:15:11.195555 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" event={"ID":"7e890559-2ff3-40aa-96ef-eeb997030eb6","Type":"ContainerStarted","Data":"467d1516c6f7433149a21563253a86c3e2063ff1808ade15bd19198e0649b603"} Mar 07 21:15:11.201764 master-0 kubenswrapper[7689]: I0307 21:15:11.200985 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" Mar 07 21:15:11.209812 master-0 kubenswrapper[7689]: I0307 21:15:11.208642 7689 patch_prober.go:28] interesting pod/controller-manager-f49f8b76c-p7dfh container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.38:8443/healthz\": dial tcp 10.128.0.38:8443: connect: connection refused" start-of-body= Mar 07 21:15:11.209812 master-0 kubenswrapper[7689]: I0307 21:15:11.208753 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" podUID="2d15cce0-2fc4-44ad-afac-038a93e34ae9" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.38:8443/healthz\": dial tcp 10.128.0.38:8443: connect: connection refused" Mar 07 21:15:11.240089 master-0 kubenswrapper[7689]: I0307 21:15:11.239995 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" podStartSLOduration=3.122593957 podStartE2EDuration="12.2399749s" podCreationTimestamp="2026-03-07 21:14:59 +0000 UTC" firstStartedPulling="2026-03-07 21:15:01.626839026 +0000 UTC m=+35.179165928" lastFinishedPulling="2026-03-07 21:15:10.744219939 +0000 UTC 
m=+44.296546871" observedRunningTime="2026-03-07 21:15:11.237267799 +0000 UTC m=+44.789594691" watchObservedRunningTime="2026-03-07 21:15:11.2399749 +0000 UTC m=+44.792301782" Mar 07 21:15:11.397120 master-0 kubenswrapper[7689]: I0307 21:15:11.397056 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" Mar 07 21:15:11.480932 master-0 kubenswrapper[7689]: I0307 21:15:11.479061 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-node-tuning-operator/tuned-qzjmv"] Mar 07 21:15:11.482661 master-0 kubenswrapper[7689]: I0307 21:15:11.482631 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515259 master-0 kubenswrapper[7689]: I0307 21:15:11.515193 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-kubernetes\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515259 master-0 kubenswrapper[7689]: I0307 21:15:11.515247 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-modprobe-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515259 master-0 kubenswrapper[7689]: I0307 21:15:11.515268 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-tmp\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " 
pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515286 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysconfig\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515316 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-systemd\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515331 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-sys\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515350 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-host\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515390 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-run\") pod \"tuned-qzjmv\" (UID: 
\"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515419 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-lib-modules\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515447 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-tuned\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515466 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515488 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-var-lib-kubelet\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515504 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-87fml\" (UniqueName: 
\"kubernetes.io/projected/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-kube-api-access-87fml\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.515586 master-0 kubenswrapper[7689]: I0307 21:15:11.515525 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-conf\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616503 master-0 kubenswrapper[7689]: I0307 21:15:11.616157 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-lib-modules\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616503 master-0 kubenswrapper[7689]: I0307 21:15:11.616213 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-tuned\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616503 master-0 kubenswrapper[7689]: I0307 21:15:11.616240 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616503 master-0 kubenswrapper[7689]: I0307 21:15:11.616270 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-var-lib-kubelet\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616503 master-0 kubenswrapper[7689]: I0307 21:15:11.616292 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87fml\" (UniqueName: \"kubernetes.io/projected/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-kube-api-access-87fml\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616503 master-0 kubenswrapper[7689]: I0307 21:15:11.616319 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-conf\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616503 master-0 kubenswrapper[7689]: I0307 21:15:11.616375 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-kubernetes\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616502 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616528 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: 
\"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-modprobe-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616614 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-tmp\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616646 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysconfig\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616698 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-kubernetes\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616715 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-systemd\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616739 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-var-lib-kubelet\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616740 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-sys\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.616968 master-0 kubenswrapper[7689]: I0307 21:15:11.616802 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-sys\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.617475 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-conf\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.617776 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-modprobe-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.617814 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-host\") pod \"tuned-qzjmv\" 
(UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.617921 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-run\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.618125 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-run\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.618163 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysconfig\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.618197 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-systemd\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.618227 master-0 kubenswrapper[7689]: I0307 21:15:11.618236 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-host\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 
21:15:11.618733 master-0 kubenswrapper[7689]: I0307 21:15:11.618608 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-lib-modules\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.626476 master-0 kubenswrapper[7689]: I0307 21:15:11.626423 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-tuned\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.627913 master-0 kubenswrapper[7689]: I0307 21:15:11.627877 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-tmp\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.641776 master-0 kubenswrapper[7689]: I0307 21:15:11.641738 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87fml\" (UniqueName: \"kubernetes.io/projected/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-kube-api-access-87fml\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.883971 master-0 kubenswrapper[7689]: I0307 21:15:11.883911 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:15:11.904967 master-0 kubenswrapper[7689]: I0307 21:15:11.904484 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-hm77f"] Mar 07 21:15:11.905396 master-0 kubenswrapper[7689]: I0307 21:15:11.905358 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:11.910044 master-0 kubenswrapper[7689]: I0307 21:15:11.910002 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 07 21:15:11.910291 master-0 kubenswrapper[7689]: I0307 21:15:11.910263 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 07 21:15:11.910408 master-0 kubenswrapper[7689]: I0307 21:15:11.910381 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 07 21:15:11.921702 master-0 kubenswrapper[7689]: I0307 21:15:11.911845 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 07 21:15:11.921702 master-0 kubenswrapper[7689]: I0307 21:15:11.913284 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hm77f"] Mar 07 21:15:11.943711 master-0 kubenswrapper[7689]: I0307 21:15:11.931659 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-config-volume\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:11.943711 master-0 kubenswrapper[7689]: I0307 21:15:11.931812 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:11.943711 master-0 kubenswrapper[7689]: I0307 21:15:11.931859 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmp5q\" (UniqueName: \"kubernetes.io/projected/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-kube-api-access-vmp5q\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.034455 master-0 kubenswrapper[7689]: I0307 21:15:12.033468 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmp5q\" (UniqueName: \"kubernetes.io/projected/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-kube-api-access-vmp5q\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.034455 master-0 kubenswrapper[7689]: I0307 21:15:12.034389 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-config-volume\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.035769 master-0 kubenswrapper[7689]: I0307 21:15:12.035630 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-config-volume\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.035769 master-0 kubenswrapper[7689]: I0307 21:15:12.035716 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.035869 master-0 kubenswrapper[7689]: E0307 21:15:12.035817 7689 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Mar 07 21:15:12.035919 master-0 kubenswrapper[7689]: E0307 21:15:12.035898 7689 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls podName:4e94f64e-4a89-4d9d-acbd-80f86bf2f964 nodeName:}" failed. No retries permitted until 2026-03-07 21:15:12.535876154 +0000 UTC m=+46.088203046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls") pod "dns-default-hm77f" (UID: "4e94f64e-4a89-4d9d-acbd-80f86bf2f964") : secret "dns-default-metrics-tls" not found Mar 07 21:15:12.056902 master-0 kubenswrapper[7689]: I0307 21:15:12.056646 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmp5q\" (UniqueName: \"kubernetes.io/projected/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-kube-api-access-vmp5q\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.212633 master-0 kubenswrapper[7689]: I0307 21:15:12.212581 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" event={"ID":"61a9fce6-50e1-413c-9ec0-177d6e903bdd","Type":"ContainerStarted","Data":"bd22dbe70b8074c2cd68f86581adec32ca0a5a2865a115b22702f5f980003ed1"} Mar 07 21:15:12.212633 master-0 kubenswrapper[7689]: I0307 21:15:12.212636 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" 
event={"ID":"61a9fce6-50e1-413c-9ec0-177d6e903bdd","Type":"ContainerStarted","Data":"22615248a555a9952d7795396bbbf575d26328617f26a88aacae00fb2dfcb7e9"} Mar 07 21:15:12.215122 master-0 kubenswrapper[7689]: I0307 21:15:12.215070 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" event={"ID":"47ecf172-666e-4360-97ff-bd9dbccc1fd6","Type":"ContainerStarted","Data":"d9c9700ef3cdaba6833e00d44e39806385f696f37ff17a4df92695c36e563c13"} Mar 07 21:15:12.215122 master-0 kubenswrapper[7689]: I0307 21:15:12.215104 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" event={"ID":"47ecf172-666e-4360-97ff-bd9dbccc1fd6","Type":"ContainerStarted","Data":"fd3068512853f47cb6ff3c220f0c55906abeef936981e6e1d0a5a3b2bfe8986d"} Mar 07 21:15:12.221315 master-0 kubenswrapper[7689]: I0307 21:15:12.221273 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" event={"ID":"2d15cce0-2fc4-44ad-afac-038a93e34ae9","Type":"ContainerStarted","Data":"350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5"} Mar 07 21:15:12.227288 master-0 kubenswrapper[7689]: I0307 21:15:12.227257 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"16204ee5-568e-453c-90c0-8966cb2b9d88","Type":"ContainerStarted","Data":"a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f"} Mar 07 21:15:12.227288 master-0 kubenswrapper[7689]: I0307 21:15:12.227288 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"16204ee5-568e-453c-90c0-8966cb2b9d88","Type":"ContainerStarted","Data":"361657c5468b73dbe4d6bbc10059a3a8e6d93b0d11c54b62883ab0c86559f65a"} Mar 07 21:15:12.241947 master-0 kubenswrapper[7689]: I0307 21:15:12.241877 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" event={"ID":"a61a736a-66e5-4ca1-a8a7-088cf73cfcce","Type":"ContainerStarted","Data":"159c79f3cd3b61112723c52943d2b3baaff5bf5ea82c6aeef718e40806b1dd65"} Mar 07 21:15:12.244404 master-0 kubenswrapper[7689]: I0307 21:15:12.244342 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"e757a93e-91aa-4fce-949b-4c51a060528e","Type":"ContainerStarted","Data":"a049a3a4077135fa9e02b1a9804eac864bd6874b0847dc250b8650ce1e94ce1d"} Mar 07 21:15:12.244479 master-0 kubenswrapper[7689]: I0307 21:15:12.244420 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"e757a93e-91aa-4fce-949b-4c51a060528e","Type":"ContainerStarted","Data":"34c367e3b7cd662a238cd3cf60724c5f41e1100b6bc750255dda8f40be5bf92e"} Mar 07 21:15:12.259152 master-0 kubenswrapper[7689]: I0307 21:15:12.255577 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" event={"ID":"7d462ed3-d191-42a5-b8e0-79ab9af13991","Type":"ContainerStarted","Data":"c098327f700751fe6a38c107559ad8b2a80af9c9060aa16b67a2b7a48e44faad"} Mar 07 21:15:12.261075 master-0 kubenswrapper[7689]: I0307 21:15:12.260667 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" event={"ID":"f8c93e0d-54e5-4c80-9d69-a70317baeacf","Type":"ContainerStarted","Data":"ce277045c24d296bd4d74241d9987bda75f36597494e88ec64b1272f784cd2cf"} Mar 07 21:15:12.267929 master-0 kubenswrapper[7689]: I0307 21:15:12.267771 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" event={"ID":"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05","Type":"ContainerStarted","Data":"8d028da554ae04959651a45a556a7a26ee7f2f273af3aeb7e3e2769bfb5c68e8"} Mar 07 21:15:12.281211 master-0 kubenswrapper[7689]: I0307 21:15:12.280908 
7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-2-master-0" podStartSLOduration=4.280829199 podStartE2EDuration="4.280829199s" podCreationTimestamp="2026-03-07 21:15:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:12.276787844 +0000 UTC m=+45.829114746" watchObservedRunningTime="2026-03-07 21:15:12.280829199 +0000 UTC m=+45.833156101" Mar 07 21:15:12.355892 master-0 kubenswrapper[7689]: I0307 21:15:12.355667 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" podStartSLOduration=1.355646608 podStartE2EDuration="1.355646608s" podCreationTimestamp="2026-03-07 21:15:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:12.352191079 +0000 UTC m=+45.904517981" watchObservedRunningTime="2026-03-07 21:15:12.355646608 +0000 UTC m=+45.907973500" Mar 07 21:15:12.366619 master-0 kubenswrapper[7689]: I0307 21:15:12.366571 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-zhkfm"] Mar 07 21:15:12.367414 master-0 kubenswrapper[7689]: I0307 21:15:12.367393 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.387545 master-0 kubenswrapper[7689]: I0307 21:15:12.387473 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-1-master-0" podStartSLOduration=6.387453908 podStartE2EDuration="6.387453908s" podCreationTimestamp="2026-03-07 21:15:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:12.387324374 +0000 UTC m=+45.939651266" watchObservedRunningTime="2026-03-07 21:15:12.387453908 +0000 UTC m=+45.939780810" Mar 07 21:15:12.450591 master-0 kubenswrapper[7689]: I0307 21:15:12.450424 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2ca65f5-7dbe-4407-b38e-713592f62136-hosts-file\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.450591 master-0 kubenswrapper[7689]: I0307 21:15:12.450553 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs7nz\" (UniqueName: \"kubernetes.io/projected/f2ca65f5-7dbe-4407-b38e-713592f62136-kube-api-access-fs7nz\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.552216 master-0 kubenswrapper[7689]: I0307 21:15:12.552091 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2ca65f5-7dbe-4407-b38e-713592f62136-hosts-file\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.552389 master-0 kubenswrapper[7689]: I0307 21:15:12.552248 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" 
(UniqueName: \"kubernetes.io/host-path/f2ca65f5-7dbe-4407-b38e-713592f62136-hosts-file\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.552455 master-0 kubenswrapper[7689]: I0307 21:15:12.552424 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.552539 master-0 kubenswrapper[7689]: I0307 21:15:12.552490 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs7nz\" (UniqueName: \"kubernetes.io/projected/f2ca65f5-7dbe-4407-b38e-713592f62136-kube-api-access-fs7nz\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.570712 master-0 kubenswrapper[7689]: I0307 21:15:12.566638 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.570712 master-0 kubenswrapper[7689]: I0307 21:15:12.569709 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:12.593586 master-0 kubenswrapper[7689]: I0307 21:15:12.593532 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs7nz\" (UniqueName: \"kubernetes.io/projected/f2ca65f5-7dbe-4407-b38e-713592f62136-kube-api-access-fs7nz\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.702184 master-0 kubenswrapper[7689]: I0307 21:15:12.702113 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:15:12.742274 master-0 kubenswrapper[7689]: W0307 21:15:12.742014 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2ca65f5_7dbe_4407_b38e_713592f62136.slice/crio-4fbc6f245e73f966c864542a880588442bd18586a3e7854a57473032e1f7135f WatchSource:0}: Error finding container 4fbc6f245e73f966c864542a880588442bd18586a3e7854a57473032e1f7135f: Status 404 returned error can't find the container with id 4fbc6f245e73f966c864542a880588442bd18586a3e7854a57473032e1f7135f Mar 07 21:15:12.857607 master-0 kubenswrapper[7689]: I0307 21:15:12.857534 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-hm77f"] Mar 07 21:15:12.873727 master-0 kubenswrapper[7689]: W0307 21:15:12.873651 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4e94f64e_4a89_4d9d_acbd_80f86bf2f964.slice/crio-9f7067a0c3d41100d0e0d6087ce95108117f8991ef8fa09df76a789ed7b78689 WatchSource:0}: Error finding container 9f7067a0c3d41100d0e0d6087ce95108117f8991ef8fa09df76a789ed7b78689: Status 404 returned error can't find the container with id 9f7067a0c3d41100d0e0d6087ce95108117f8991ef8fa09df76a789ed7b78689 Mar 07 21:15:13.273947 master-0 kubenswrapper[7689]: I0307 21:15:13.272965 7689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" event={"ID":"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05","Type":"ContainerStarted","Data":"7bd91a5796117967d146d1b5b6c98971b82bbe9bf6a45f24bdc8ec0e419872d2"} Mar 07 21:15:13.276034 master-0 kubenswrapper[7689]: I0307 21:15:13.275958 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zhkfm" event={"ID":"f2ca65f5-7dbe-4407-b38e-713592f62136","Type":"ContainerStarted","Data":"f132a0d60a70fefbd2e825d54ff740706121baeba377055de1e18854628a1d13"} Mar 07 21:15:13.276100 master-0 kubenswrapper[7689]: I0307 21:15:13.276068 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-zhkfm" event={"ID":"f2ca65f5-7dbe-4407-b38e-713592f62136","Type":"ContainerStarted","Data":"4fbc6f245e73f966c864542a880588442bd18586a3e7854a57473032e1f7135f"} Mar 07 21:15:13.277690 master-0 kubenswrapper[7689]: I0307 21:15:13.277634 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hm77f" event={"ID":"4e94f64e-4a89-4d9d-acbd-80f86bf2f964","Type":"ContainerStarted","Data":"9f7067a0c3d41100d0e0d6087ce95108117f8991ef8fa09df76a789ed7b78689"} Mar 07 21:15:13.291187 master-0 kubenswrapper[7689]: I0307 21:15:13.291065 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-zhkfm" podStartSLOduration=1.291041418 podStartE2EDuration="1.291041418s" podCreationTimestamp="2026-03-07 21:15:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:13.290128554 +0000 UTC m=+46.842455456" watchObservedRunningTime="2026-03-07 21:15:13.291041418 +0000 UTC m=+46.843368350" Mar 07 21:15:15.036112 master-0 kubenswrapper[7689]: I0307 21:15:15.035171 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 07 
21:15:15.038630 master-0 kubenswrapper[7689]: I0307 21:15:15.038596 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.041153 master-0 kubenswrapper[7689]: I0307 21:15:15.041107 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 07 21:15:15.059137 master-0 kubenswrapper[7689]: I0307 21:15:15.059090 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 07 21:15:15.105455 master-0 kubenswrapper[7689]: I0307 21:15:15.105379 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/576e332a-c381-4582-bb5e-02d32bb376a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.105643 master-0 kubenswrapper[7689]: I0307 21:15:15.105532 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-var-lock\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.105643 master-0 kubenswrapper[7689]: I0307 21:15:15.105614 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.207093 master-0 kubenswrapper[7689]: I0307 21:15:15.207022 7689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.207441 master-0 kubenswrapper[7689]: I0307 21:15:15.207153 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/576e332a-c381-4582-bb5e-02d32bb376a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.207441 master-0 kubenswrapper[7689]: I0307 21:15:15.207197 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-var-lock\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.207441 master-0 kubenswrapper[7689]: I0307 21:15:15.207312 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-var-lock\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.207441 master-0 kubenswrapper[7689]: I0307 21:15:15.207360 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.229586 master-0 kubenswrapper[7689]: I0307 21:15:15.229511 7689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/576e332a-c381-4582-bb5e-02d32bb376a4-kube-api-access\") pod \"installer-1-master-0\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:15.301083 master-0 kubenswrapper[7689]: I0307 21:15:15.299544 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" event={"ID":"7e890559-2ff3-40aa-96ef-eeb997030eb6","Type":"ContainerStarted","Data":"bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52"} Mar 07 21:15:15.301083 master-0 kubenswrapper[7689]: I0307 21:15:15.300046 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" Mar 07 21:15:15.318053 master-0 kubenswrapper[7689]: I0307 21:15:15.317950 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" podStartSLOduration=13.341154531 podStartE2EDuration="16.317930417s" podCreationTimestamp="2026-03-07 21:14:59 +0000 UTC" firstStartedPulling="2026-03-07 21:15:11.057452063 +0000 UTC m=+44.609778955" lastFinishedPulling="2026-03-07 21:15:14.034227949 +0000 UTC m=+47.586554841" observedRunningTime="2026-03-07 21:15:15.315532594 +0000 UTC m=+48.867859506" watchObservedRunningTime="2026-03-07 21:15:15.317930417 +0000 UTC m=+48.870257309" Mar 07 21:15:15.408958 master-0 kubenswrapper[7689]: I0307 21:15:15.408858 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0"
Mar 07 21:15:15.572313 master-0 kubenswrapper[7689]: I0307 21:15:15.572153 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:16.335945 master-0 kubenswrapper[7689]: I0307 21:15:16.335872 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 07 21:15:16.336649 master-0 kubenswrapper[7689]: I0307 21:15:16.336256 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-2-master-0" podUID="16204ee5-568e-453c-90c0-8966cb2b9d88" containerName="installer" containerID="cri-o://a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f" gracePeriod=30
Mar 07 21:15:16.916368 master-0 kubenswrapper[7689]: I0307 21:15:16.916207 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"]
Mar 07 21:15:16.916821 master-0 kubenswrapper[7689]: I0307 21:15:16.916734 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" podUID="2d15cce0-2fc4-44ad-afac-038a93e34ae9" containerName="controller-manager" containerID="cri-o://350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5" gracePeriod=30
Mar 07 21:15:16.916992 master-0 kubenswrapper[7689]: I0307 21:15:16.916952 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"]
Mar 07 21:15:18.067918 master-0 kubenswrapper[7689]: I0307 21:15:18.067873 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:18.076164 master-0 kubenswrapper[7689]: I0307 21:15:18.075862 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_16204ee5-568e-453c-90c0-8966cb2b9d88/installer/0.log"
Mar 07 21:15:18.076164 master-0 kubenswrapper[7689]: I0307 21:15:18.075944 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: I0307 21:15:18.107136 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"]
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: E0307 21:15:18.107387 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d15cce0-2fc4-44ad-afac-038a93e34ae9" containerName="controller-manager"
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: I0307 21:15:18.107406 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d15cce0-2fc4-44ad-afac-038a93e34ae9" containerName="controller-manager"
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: E0307 21:15:18.107428 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16204ee5-568e-453c-90c0-8966cb2b9d88" containerName="installer"
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: I0307 21:15:18.107437 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="16204ee5-568e-453c-90c0-8966cb2b9d88" containerName="installer"
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: I0307 21:15:18.107544 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="16204ee5-568e-453c-90c0-8966cb2b9d88" containerName="installer"
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: I0307 21:15:18.107564 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d15cce0-2fc4-44ad-afac-038a93e34ae9" containerName="controller-manager"
Mar 07 21:15:18.115747 master-0 kubenswrapper[7689]: I0307 21:15:18.108122 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.136896 master-0 kubenswrapper[7689]: I0307 21:15:18.136777 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"]
Mar 07 21:15:18.179171 master-0 kubenswrapper[7689]: I0307 21:15:18.179036 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-var-lock\") pod \"16204ee5-568e-453c-90c0-8966cb2b9d88\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") "
Mar 07 21:15:18.179171 master-0 kubenswrapper[7689]: I0307 21:15:18.179134 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16204ee5-568e-453c-90c0-8966cb2b9d88-kube-api-access\") pod \"16204ee5-568e-453c-90c0-8966cb2b9d88\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") "
Mar 07 21:15:18.179378 master-0 kubenswrapper[7689]: I0307 21:15:18.179208 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xvqqg\" (UniqueName: \"kubernetes.io/projected/2d15cce0-2fc4-44ad-afac-038a93e34ae9-kube-api-access-xvqqg\") pod \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") "
Mar 07 21:15:18.179378 master-0 kubenswrapper[7689]: I0307 21:15:18.179237 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-kubelet-dir\") pod \"16204ee5-568e-453c-90c0-8966cb2b9d88\" (UID: \"16204ee5-568e-453c-90c0-8966cb2b9d88\") "
Mar 07 21:15:18.179378 master-0 kubenswrapper[7689]: I0307 21:15:18.179297 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-proxy-ca-bundles\") pod \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") "
Mar 07 21:15:18.179378 master-0 kubenswrapper[7689]: I0307 21:15:18.179333 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-client-ca\") pod \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") "
Mar 07 21:15:18.179502 master-0 kubenswrapper[7689]: I0307 21:15:18.179397 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-config\") pod \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") "
Mar 07 21:15:18.179502 master-0 kubenswrapper[7689]: I0307 21:15:18.179436 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15cce0-2fc4-44ad-afac-038a93e34ae9-serving-cert\") pod \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\" (UID: \"2d15cce0-2fc4-44ad-afac-038a93e34ae9\") "
Mar 07 21:15:18.179810 master-0 kubenswrapper[7689]: I0307 21:15:18.179626 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83377bd5-67a6-4108-b3ac-a3d338813fc1-serving-cert\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.179810 master-0 kubenswrapper[7689]: I0307 21:15:18.179724 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-client-ca\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.179810 master-0 kubenswrapper[7689]: I0307 21:15:18.179751 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-proxy-ca-bundles\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.179810 master-0 kubenswrapper[7689]: I0307 21:15:18.179793 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-982pp\" (UniqueName: \"kubernetes.io/projected/83377bd5-67a6-4108-b3ac-a3d338813fc1-kube-api-access-982pp\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.179938 master-0 kubenswrapper[7689]: I0307 21:15:18.179821 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-config\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.179938 master-0 kubenswrapper[7689]: I0307 21:15:18.179924 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-var-lock" (OuterVolumeSpecName: "var-lock") pod "16204ee5-568e-453c-90c0-8966cb2b9d88" (UID: "16204ee5-568e-453c-90c0-8966cb2b9d88"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:15:18.180255 master-0 kubenswrapper[7689]: I0307 21:15:18.180205 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "16204ee5-568e-453c-90c0-8966cb2b9d88" (UID: "16204ee5-568e-453c-90c0-8966cb2b9d88"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:15:18.181063 master-0 kubenswrapper[7689]: I0307 21:15:18.180862 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-config" (OuterVolumeSpecName: "config") pod "2d15cce0-2fc4-44ad-afac-038a93e34ae9" (UID: "2d15cce0-2fc4-44ad-afac-038a93e34ae9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:18.181063 master-0 kubenswrapper[7689]: I0307 21:15:18.180951 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "2d15cce0-2fc4-44ad-afac-038a93e34ae9" (UID: "2d15cce0-2fc4-44ad-afac-038a93e34ae9"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:18.181063 master-0 kubenswrapper[7689]: I0307 21:15:18.181028 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-client-ca" (OuterVolumeSpecName: "client-ca") pod "2d15cce0-2fc4-44ad-afac-038a93e34ae9" (UID: "2d15cce0-2fc4-44ad-afac-038a93e34ae9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:18.185698 master-0 kubenswrapper[7689]: I0307 21:15:18.183587 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2d15cce0-2fc4-44ad-afac-038a93e34ae9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2d15cce0-2fc4-44ad-afac-038a93e34ae9" (UID: "2d15cce0-2fc4-44ad-afac-038a93e34ae9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:15:18.185983 master-0 kubenswrapper[7689]: I0307 21:15:18.185756 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d15cce0-2fc4-44ad-afac-038a93e34ae9-kube-api-access-xvqqg" (OuterVolumeSpecName: "kube-api-access-xvqqg") pod "2d15cce0-2fc4-44ad-afac-038a93e34ae9" (UID: "2d15cce0-2fc4-44ad-afac-038a93e34ae9"). InnerVolumeSpecName "kube-api-access-xvqqg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:15:18.185983 master-0 kubenswrapper[7689]: I0307 21:15:18.185823 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16204ee5-568e-453c-90c0-8966cb2b9d88-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "16204ee5-568e-453c-90c0-8966cb2b9d88" (UID: "16204ee5-568e-453c-90c0-8966cb2b9d88"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:15:18.239399 master-0 kubenswrapper[7689]: I0307 21:15:18.239327 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 07 21:15:18.280844 master-0 kubenswrapper[7689]: I0307 21:15:18.280765 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-config\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.280992 master-0 kubenswrapper[7689]: I0307 21:15:18.280882 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83377bd5-67a6-4108-b3ac-a3d338813fc1-serving-cert\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.280992 master-0 kubenswrapper[7689]: I0307 21:15:18.280973 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-client-ca\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.281096 master-0 kubenswrapper[7689]: I0307 21:15:18.281007 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-proxy-ca-bundles\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.281096 master-0 kubenswrapper[7689]: I0307 21:15:18.281060 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-982pp\" (UniqueName: \"kubernetes.io/projected/83377bd5-67a6-4108-b3ac-a3d338813fc1-kube-api-access-982pp\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.281189 master-0 kubenswrapper[7689]: I0307 21:15:18.281123 7689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.281189 master-0 kubenswrapper[7689]: I0307 21:15:18.281140 7689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.281189 master-0 kubenswrapper[7689]: I0307 21:15:18.281155 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d15cce0-2fc4-44ad-afac-038a93e34ae9-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.281189 master-0 kubenswrapper[7689]: I0307 21:15:18.281170 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d15cce0-2fc4-44ad-afac-038a93e34ae9-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.281189 master-0 kubenswrapper[7689]: I0307 21:15:18.281185 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.281388 master-0 kubenswrapper[7689]: I0307 21:15:18.281198 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/16204ee5-568e-453c-90c0-8966cb2b9d88-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.281388 master-0 kubenswrapper[7689]: I0307 21:15:18.281214 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xvqqg\" (UniqueName: \"kubernetes.io/projected/2d15cce0-2fc4-44ad-afac-038a93e34ae9-kube-api-access-xvqqg\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.281388 master-0 kubenswrapper[7689]: I0307 21:15:18.281229 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/16204ee5-568e-453c-90c0-8966cb2b9d88-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.283384 master-0 kubenswrapper[7689]: I0307 21:15:18.283344 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-config\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.284482 master-0 kubenswrapper[7689]: I0307 21:15:18.284440 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-client-ca\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.285852 master-0 kubenswrapper[7689]: I0307 21:15:18.285806 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-proxy-ca-bundles\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.286566 master-0 kubenswrapper[7689]: I0307 21:15:18.286525 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83377bd5-67a6-4108-b3ac-a3d338813fc1-serving-cert\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.301823 master-0 kubenswrapper[7689]: I0307 21:15:18.301774 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-982pp\" (UniqueName: \"kubernetes.io/projected/83377bd5-67a6-4108-b3ac-a3d338813fc1-kube-api-access-982pp\") pod \"controller-manager-64655dcbb9-bp4zn\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") " pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.316403 master-0 kubenswrapper[7689]: I0307 21:15:18.316349 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-2-master-0_16204ee5-568e-453c-90c0-8966cb2b9d88/installer/0.log"
Mar 07 21:15:18.316530 master-0 kubenswrapper[7689]: I0307 21:15:18.316401 7689 generic.go:334] "Generic (PLEG): container finished" podID="16204ee5-568e-453c-90c0-8966cb2b9d88" containerID="a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f" exitCode=1
Mar 07 21:15:18.316530 master-0 kubenswrapper[7689]: I0307 21:15:18.316517 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-2-master-0"
Mar 07 21:15:18.317321 master-0 kubenswrapper[7689]: I0307 21:15:18.316739 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"16204ee5-568e-453c-90c0-8966cb2b9d88","Type":"ContainerDied","Data":"a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f"}
Mar 07 21:15:18.317321 master-0 kubenswrapper[7689]: I0307 21:15:18.316802 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-2-master-0" event={"ID":"16204ee5-568e-453c-90c0-8966cb2b9d88","Type":"ContainerDied","Data":"361657c5468b73dbe4d6bbc10059a3a8e6d93b0d11c54b62883ab0c86559f65a"}
Mar 07 21:15:18.317321 master-0 kubenswrapper[7689]: I0307 21:15:18.316854 7689 scope.go:117] "RemoveContainer" containerID="a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f"
Mar 07 21:15:18.318556 master-0 kubenswrapper[7689]: I0307 21:15:18.318419 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"576e332a-c381-4582-bb5e-02d32bb376a4","Type":"ContainerStarted","Data":"27583492499e035b40b8f072f078cc77a6db2ea1938b124426f193c21478705d"}
Mar 07 21:15:18.325130 master-0 kubenswrapper[7689]: I0307 21:15:18.325052 7689 generic.go:334] "Generic (PLEG): container finished" podID="2d15cce0-2fc4-44ad-afac-038a93e34ae9" containerID="350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5" exitCode=0
Mar 07 21:15:18.325278 master-0 kubenswrapper[7689]: I0307 21:15:18.325160 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"
Mar 07 21:15:18.325278 master-0 kubenswrapper[7689]: I0307 21:15:18.325221 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" event={"ID":"2d15cce0-2fc4-44ad-afac-038a93e34ae9","Type":"ContainerDied","Data":"350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5"}
Mar 07 21:15:18.325278 master-0 kubenswrapper[7689]: I0307 21:15:18.325265 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f49f8b76c-p7dfh" event={"ID":"2d15cce0-2fc4-44ad-afac-038a93e34ae9","Type":"ContainerDied","Data":"5b9dc197367ff3c95b79be8d47728e9c0e3edf2f81deec1d18569373f08b6bf4"}
Mar 07 21:15:18.327860 master-0 kubenswrapper[7689]: I0307 21:15:18.327802 7689 generic.go:334] "Generic (PLEG): container finished" podID="7d462ed3-d191-42a5-b8e0-79ab9af13991" containerID="0d151e14131d9760e6564e5100dd52c146e8d9e99a88e5e1621708256085d68d" exitCode=0
Mar 07 21:15:18.327860 master-0 kubenswrapper[7689]: I0307 21:15:18.327885 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" event={"ID":"7d462ed3-d191-42a5-b8e0-79ab9af13991","Type":"ContainerDied","Data":"0d151e14131d9760e6564e5100dd52c146e8d9e99a88e5e1621708256085d68d"}
Mar 07 21:15:18.330823 master-0 kubenswrapper[7689]: I0307 21:15:18.330621 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hm77f" event={"ID":"4e94f64e-4a89-4d9d-acbd-80f86bf2f964","Type":"ContainerStarted","Data":"ac7ebdfacdfbafe18a6d91d20267874c89b59c5ddf36455a3982315822b46467"}
Mar 07 21:15:18.332917 master-0 kubenswrapper[7689]: I0307 21:15:18.332864 7689 generic.go:334] "Generic (PLEG): container finished" podID="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" containerID="00d0a15073fd7fa3444cc6741cebfa512af7efaa071a744a0077952511813908" exitCode=0
Mar 07 21:15:18.333025 master-0 kubenswrapper[7689]: I0307 21:15:18.332943 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-694d775589-btnh4" event={"ID":"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9","Type":"ContainerDied","Data":"00d0a15073fd7fa3444cc6741cebfa512af7efaa071a744a0077952511813908"}
Mar 07 21:15:18.333226 master-0 kubenswrapper[7689]: I0307 21:15:18.333181 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" podUID="7e890559-2ff3-40aa-96ef-eeb997030eb6" containerName="route-controller-manager" containerID="cri-o://bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52" gracePeriod=30
Mar 07 21:15:18.341810 master-0 kubenswrapper[7689]: I0307 21:15:18.341738 7689 scope.go:117] "RemoveContainer" containerID="a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f"
Mar 07 21:15:18.342050 master-0 kubenswrapper[7689]: I0307 21:15:18.341972 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 07 21:15:18.342431 master-0 kubenswrapper[7689]: E0307 21:15:18.342385 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f\": container with ID starting with a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f not found: ID does not exist" containerID="a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f"
Mar 07 21:15:18.342632 master-0 kubenswrapper[7689]: I0307 21:15:18.342566 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f"} err="failed to get container status \"a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f\": rpc error: code = NotFound desc = could not find container \"a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f\": container with ID starting with a38eb80e4f70527181fde947140bede2f1b45407349074b2ca478900b2d60d6f not found: ID does not exist"
Mar 07 21:15:18.342802 master-0 kubenswrapper[7689]: I0307 21:15:18.342780 7689 scope.go:117] "RemoveContainer" containerID="350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5"
Mar 07 21:15:18.343624 master-0 kubenswrapper[7689]: I0307 21:15:18.343576 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.365004 master-0 kubenswrapper[7689]: I0307 21:15:18.364914 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 07 21:15:18.395452 master-0 kubenswrapper[7689]: I0307 21:15:18.395411 7689 scope.go:117] "RemoveContainer" containerID="350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5"
Mar 07 21:15:18.397266 master-0 kubenswrapper[7689]: E0307 21:15:18.397214 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5\": container with ID starting with 350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5 not found: ID does not exist" containerID="350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5"
Mar 07 21:15:18.397358 master-0 kubenswrapper[7689]: I0307 21:15:18.397282 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5"} err="failed to get container status \"350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5\": rpc error: code = NotFound desc = could not find container \"350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5\": container with ID starting with 350e35c1c8ec5a092d2c7c275121f0aaa18826d1e183f11e9173360377b9d2d5 not found: ID does not exist"
Mar 07 21:15:18.434050 master-0 kubenswrapper[7689]: I0307 21:15:18.433888 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"]
Mar 07 21:15:18.434904 master-0 kubenswrapper[7689]: I0307 21:15:18.434853 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f49f8b76c-p7dfh"]
Mar 07 21:15:18.446234 master-0 kubenswrapper[7689]: I0307 21:15:18.446176 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 07 21:15:18.452588 master-0 kubenswrapper[7689]: I0307 21:15:18.452536 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-2-master-0"]
Mar 07 21:15:18.488607 master-0 kubenswrapper[7689]: I0307 21:15:18.488090 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-var-lock\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.488607 master-0 kubenswrapper[7689]: I0307 21:15:18.488257 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.488607 master-0 kubenswrapper[7689]: I0307 21:15:18.488489 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e969ada0-e795-4802-a212-aeabf75de371-kube-api-access\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.537652 master-0 kubenswrapper[7689]: I0307 21:15:18.537589 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:18.590062 master-0 kubenswrapper[7689]: I0307 21:15:18.590014 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-var-lock\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.590161 master-0 kubenswrapper[7689]: I0307 21:15:18.590082 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.590385 master-0 kubenswrapper[7689]: I0307 21:15:18.590322 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-var-lock\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.590457 master-0 kubenswrapper[7689]: I0307 21:15:18.590434 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.590513 master-0 kubenswrapper[7689]: I0307 21:15:18.590421 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e969ada0-e795-4802-a212-aeabf75de371-kube-api-access\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.612747 master-0 kubenswrapper[7689]: I0307 21:15:18.611792 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e969ada0-e795-4802-a212-aeabf75de371-kube-api-access\") pod \"installer-3-master-0\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.684646 master-0 kubenswrapper[7689]: I0307 21:15:18.684568 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0"
Mar 07 21:15:18.692303 master-0 kubenswrapper[7689]: I0307 21:15:18.692245 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16204ee5-568e-453c-90c0-8966cb2b9d88" path="/var/lib/kubelet/pods/16204ee5-568e-453c-90c0-8966cb2b9d88/volumes"
Mar 07 21:15:18.693493 master-0 kubenswrapper[7689]: I0307 21:15:18.693209 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d15cce0-2fc4-44ad-afac-038a93e34ae9" path="/var/lib/kubelet/pods/2d15cce0-2fc4-44ad-afac-038a93e34ae9/volumes"
Mar 07 21:15:18.792910 master-0 kubenswrapper[7689]: I0307 21:15:18.792871 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"
Mar 07 21:15:18.893485 master-0 kubenswrapper[7689]: I0307 21:15:18.893411 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9pjqr\" (UniqueName: \"kubernetes.io/projected/7e890559-2ff3-40aa-96ef-eeb997030eb6-kube-api-access-9pjqr\") pod \"7e890559-2ff3-40aa-96ef-eeb997030eb6\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") "
Mar 07 21:15:18.893485 master-0 kubenswrapper[7689]: I0307 21:15:18.893487 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-client-ca\") pod \"7e890559-2ff3-40aa-96ef-eeb997030eb6\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") "
Mar 07 21:15:18.893848 master-0 kubenswrapper[7689]: I0307 21:15:18.893586 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") pod \"7e890559-2ff3-40aa-96ef-eeb997030eb6\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") "
Mar 07 21:15:18.893848 master-0 kubenswrapper[7689]: I0307 21:15:18.893650 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-config\") pod \"7e890559-2ff3-40aa-96ef-eeb997030eb6\" (UID: \"7e890559-2ff3-40aa-96ef-eeb997030eb6\") "
Mar 07 21:15:18.894727 master-0 kubenswrapper[7689]: I0307 21:15:18.894661 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-config" (OuterVolumeSpecName: "config") pod "7e890559-2ff3-40aa-96ef-eeb997030eb6" (UID: "7e890559-2ff3-40aa-96ef-eeb997030eb6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:18.895748 master-0 kubenswrapper[7689]: I0307 21:15:18.895703 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-client-ca" (OuterVolumeSpecName: "client-ca") pod "7e890559-2ff3-40aa-96ef-eeb997030eb6" (UID: "7e890559-2ff3-40aa-96ef-eeb997030eb6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:18.904384 master-0 kubenswrapper[7689]: I0307 21:15:18.904301 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7e890559-2ff3-40aa-96ef-eeb997030eb6-kube-api-access-9pjqr" (OuterVolumeSpecName: "kube-api-access-9pjqr") pod "7e890559-2ff3-40aa-96ef-eeb997030eb6" (UID: "7e890559-2ff3-40aa-96ef-eeb997030eb6"). InnerVolumeSpecName "kube-api-access-9pjqr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:15:18.905674 master-0 kubenswrapper[7689]: I0307 21:15:18.905623 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7e890559-2ff3-40aa-96ef-eeb997030eb6" (UID: "7e890559-2ff3-40aa-96ef-eeb997030eb6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:15:18.995638 master-0 kubenswrapper[7689]: I0307 21:15:18.995593 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.995638 master-0 kubenswrapper[7689]: I0307 21:15:18.995635 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9pjqr\" (UniqueName: \"kubernetes.io/projected/7e890559-2ff3-40aa-96ef-eeb997030eb6-kube-api-access-9pjqr\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.995845 master-0 kubenswrapper[7689]: I0307 21:15:18.995649 7689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7e890559-2ff3-40aa-96ef-eeb997030eb6-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:18.995845 master-0 kubenswrapper[7689]: I0307 21:15:18.995662 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7e890559-2ff3-40aa-96ef-eeb997030eb6-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:19.078739 master-0 kubenswrapper[7689]: I0307 21:15:19.078635 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"]
Mar 07 21:15:19.086026 master-0 kubenswrapper[7689]: W0307 21:15:19.085967 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83377bd5_67a6_4108_b3ac_a3d338813fc1.slice/crio-c8d2c0237170ab58ef6b21b5bf0bf18e7f397acaddfef52805db630c8eaf1d53 WatchSource:0}: Error finding container c8d2c0237170ab58ef6b21b5bf0bf18e7f397acaddfef52805db630c8eaf1d53: Status 404 returned error can't find the container with id c8d2c0237170ab58ef6b21b5bf0bf18e7f397acaddfef52805db630c8eaf1d53
Mar 07 21:15:19.134380 master-0 kubenswrapper[7689]: I0307 21:15:19.134314 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"]
Mar 07 21:15:19.350850 master-0 kubenswrapper[7689]: I0307 21:15:19.350798 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"576e332a-c381-4582-bb5e-02d32bb376a4","Type":"ContainerStarted","Data":"03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c"}
Mar 07 21:15:19.354898 master-0 kubenswrapper[7689]: I0307 21:15:19.354876 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" event={"ID":"7d462ed3-d191-42a5-b8e0-79ab9af13991","Type":"ContainerStarted","Data":"bc19ce923f9e7be34ac1703c77a7fa3df03c790dead075489babb3a43ed5693c"}
Mar 07 21:15:19.365434 master-0 kubenswrapper[7689]: I0307 21:15:19.365344 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-hm77f" event={"ID":"4e94f64e-4a89-4d9d-acbd-80f86bf2f964","Type":"ContainerStarted","Data":"b38e2094c92ea8696343a2c8074ab509ad8ebd59efba96c3245a7ea95d7c83a0"}
Mar 07 21:15:19.365714 master-0 kubenswrapper[7689]: I0307 21:15:19.365671 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hm77f"
Mar 07 21:15:19.373041 master-0 kubenswrapper[7689]: I0307 21:15:19.372947 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-694d775589-btnh4" event={"ID":"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9","Type":"ContainerStarted","Data":"d8836ebedb4515b4f09a6a21f0c906ce28fbfab06333aebef63266c85f551a3c"}
Mar 07 21:15:19.373234 master-0 kubenswrapper[7689]: I0307 21:15:19.373084 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-694d775589-btnh4" event={"ID":"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9","Type":"ContainerStarted","Data":"626ba943325c6852acf60f9a1ea55c7b7cafc4c800af7a6bd832b6f67db56e59"}
Mar 07
21:15:19.374354 master-0 kubenswrapper[7689]: I0307 21:15:19.374312 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e969ada0-e795-4802-a212-aeabf75de371","Type":"ContainerStarted","Data":"74a7e4f4531cb3594a45acf0ff68e982eea538a5e2a9de26ff5bfc8c54f201b3"} Mar 07 21:15:19.375708 master-0 kubenswrapper[7689]: I0307 21:15:19.375626 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" event={"ID":"83377bd5-67a6-4108-b3ac-a3d338813fc1","Type":"ContainerStarted","Data":"cf850f29efe8ab776668f6d57f50250ae977e92fba1ea0538e7ff3a1dfc8b3ac"} Mar 07 21:15:19.375779 master-0 kubenswrapper[7689]: I0307 21:15:19.375717 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" event={"ID":"83377bd5-67a6-4108-b3ac-a3d338813fc1","Type":"ContainerStarted","Data":"c8d2c0237170ab58ef6b21b5bf0bf18e7f397acaddfef52805db630c8eaf1d53"} Mar 07 21:15:19.376649 master-0 kubenswrapper[7689]: I0307 21:15:19.376494 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" Mar 07 21:15:19.377651 master-0 kubenswrapper[7689]: I0307 21:15:19.377614 7689 generic.go:334] "Generic (PLEG): container finished" podID="7e890559-2ff3-40aa-96ef-eeb997030eb6" containerID="bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52" exitCode=0 Mar 07 21:15:19.377651 master-0 kubenswrapper[7689]: I0307 21:15:19.377645 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" event={"ID":"7e890559-2ff3-40aa-96ef-eeb997030eb6","Type":"ContainerDied","Data":"bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52"} Mar 07 21:15:19.377755 master-0 kubenswrapper[7689]: I0307 21:15:19.377660 7689 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" event={"ID":"7e890559-2ff3-40aa-96ef-eeb997030eb6","Type":"ContainerDied","Data":"467d1516c6f7433149a21563253a86c3e2063ff1808ade15bd19198e0649b603"} Mar 07 21:15:19.377755 master-0 kubenswrapper[7689]: I0307 21:15:19.377689 7689 scope.go:117] "RemoveContainer" containerID="bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52" Mar 07 21:15:19.378048 master-0 kubenswrapper[7689]: I0307 21:15:19.378019 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s" Mar 07 21:15:19.380347 master-0 kubenswrapper[7689]: I0307 21:15:19.380085 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-1-master-0" podStartSLOduration=4.3800713909999995 podStartE2EDuration="4.380071391s" podCreationTimestamp="2026-03-07 21:15:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:19.373541191 +0000 UTC m=+52.925868093" watchObservedRunningTime="2026-03-07 21:15:19.380071391 +0000 UTC m=+52.932398283" Mar 07 21:15:19.382632 master-0 kubenswrapper[7689]: I0307 21:15:19.382493 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" Mar 07 21:15:19.399079 master-0 kubenswrapper[7689]: I0307 21:15:19.399024 7689 scope.go:117] "RemoveContainer" containerID="bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52" Mar 07 21:15:19.401817 master-0 kubenswrapper[7689]: I0307 21:15:19.401715 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" podStartSLOduration=3.75626026 podStartE2EDuration="10.401695644s" podCreationTimestamp="2026-03-07 21:15:09 
+0000 UTC" firstStartedPulling="2026-03-07 21:15:11.098624256 +0000 UTC m=+44.650951148" lastFinishedPulling="2026-03-07 21:15:17.74405964 +0000 UTC m=+51.296386532" observedRunningTime="2026-03-07 21:15:19.400602496 +0000 UTC m=+52.952929388" watchObservedRunningTime="2026-03-07 21:15:19.401695644 +0000 UTC m=+52.954022566" Mar 07 21:15:19.402382 master-0 kubenswrapper[7689]: E0307 21:15:19.402339 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52\": container with ID starting with bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52 not found: ID does not exist" containerID="bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52" Mar 07 21:15:19.402433 master-0 kubenswrapper[7689]: I0307 21:15:19.402393 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52"} err="failed to get container status \"bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52\": rpc error: code = NotFound desc = could not find container \"bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52\": container with ID starting with bb9df8c11e1f432584b936b6bdce87599c3f2b4a893a0103b52bb48c94d45d52 not found: ID does not exist" Mar 07 21:15:19.430258 master-0 kubenswrapper[7689]: I0307 21:15:19.430170 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-hm77f" podStartSLOduration=3.591156894 podStartE2EDuration="8.430141936s" podCreationTimestamp="2026-03-07 21:15:11 +0000 UTC" firstStartedPulling="2026-03-07 21:15:12.877973582 +0000 UTC m=+46.430300474" lastFinishedPulling="2026-03-07 21:15:17.716958624 +0000 UTC m=+51.269285516" observedRunningTime="2026-03-07 21:15:19.428561115 +0000 UTC m=+52.980887997" watchObservedRunningTime="2026-03-07 
21:15:19.430141936 +0000 UTC m=+52.982468828" Mar 07 21:15:19.450635 master-0 kubenswrapper[7689]: I0307 21:15:19.450032 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-694d775589-btnh4" podStartSLOduration=9.759897814 podStartE2EDuration="16.450010074s" podCreationTimestamp="2026-03-07 21:15:03 +0000 UTC" firstStartedPulling="2026-03-07 21:15:11.08076614 +0000 UTC m=+44.633093032" lastFinishedPulling="2026-03-07 21:15:17.7708784 +0000 UTC m=+51.323205292" observedRunningTime="2026-03-07 21:15:19.449547462 +0000 UTC m=+53.001874384" watchObservedRunningTime="2026-03-07 21:15:19.450010074 +0000 UTC m=+53.002336976" Mar 07 21:15:19.461299 master-0 kubenswrapper[7689]: I0307 21:15:19.460590 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"] Mar 07 21:15:19.465068 master-0 kubenswrapper[7689]: I0307 21:15:19.465029 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7c8cdf56b5-h464s"] Mar 07 21:15:19.652744 master-0 kubenswrapper[7689]: I0307 21:15:19.650115 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" podStartSLOduration=3.650094309 podStartE2EDuration="3.650094309s" podCreationTimestamp="2026-03-07 21:15:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:19.487379408 +0000 UTC m=+53.039706300" watchObservedRunningTime="2026-03-07 21:15:19.650094309 +0000 UTC m=+53.202421201" Mar 07 21:15:19.652744 master-0 kubenswrapper[7689]: I0307 21:15:19.650353 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 07 21:15:19.652744 master-0 kubenswrapper[7689]: E0307 21:15:19.650542 7689 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="7e890559-2ff3-40aa-96ef-eeb997030eb6" containerName="route-controller-manager" Mar 07 21:15:19.652744 master-0 kubenswrapper[7689]: I0307 21:15:19.650553 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="7e890559-2ff3-40aa-96ef-eeb997030eb6" containerName="route-controller-manager" Mar 07 21:15:19.652744 master-0 kubenswrapper[7689]: I0307 21:15:19.650630 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="7e890559-2ff3-40aa-96ef-eeb997030eb6" containerName="route-controller-manager" Mar 07 21:15:19.652744 master-0 kubenswrapper[7689]: I0307 21:15:19.650940 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.653317 master-0 kubenswrapper[7689]: I0307 21:15:19.653258 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 07 21:15:19.665707 master-0 kubenswrapper[7689]: I0307 21:15:19.664046 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 07 21:15:19.717719 master-0 kubenswrapper[7689]: I0307 21:15:19.715215 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.717719 master-0 kubenswrapper[7689]: I0307 21:15:19.715302 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e734b7-82d6-493d-ace8-1945b2c08c6d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.717719 
master-0 kubenswrapper[7689]: I0307 21:15:19.715565 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-var-lock\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.816908 master-0 kubenswrapper[7689]: I0307 21:15:19.816816 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-var-lock\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.817175 master-0 kubenswrapper[7689]: I0307 21:15:19.817011 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-var-lock\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.817261 master-0 kubenswrapper[7689]: I0307 21:15:19.817204 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.817452 master-0 kubenswrapper[7689]: I0307 21:15:19.817414 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-kubelet-dir\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.817452 master-0 kubenswrapper[7689]: I0307 
21:15:19.817391 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e734b7-82d6-493d-ace8-1945b2c08c6d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.838856 master-0 kubenswrapper[7689]: I0307 21:15:19.837245 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e734b7-82d6-493d-ace8-1945b2c08c6d-kube-api-access\") pod \"installer-1-master-0\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:19.944795 master-0 kubenswrapper[7689]: I0307 21:15:19.944581 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:19.944795 master-0 kubenswrapper[7689]: I0307 21:15:19.944651 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:19.955871 master-0 kubenswrapper[7689]: I0307 21:15:19.955827 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:19.981526 master-0 kubenswrapper[7689]: I0307 21:15:19.981447 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:15:20.321833 master-0 kubenswrapper[7689]: I0307 21:15:20.321704 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"] Mar 07 21:15:20.322435 master-0 kubenswrapper[7689]: I0307 21:15:20.322407 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.325605 master-0 kubenswrapper[7689]: I0307 21:15:20.325565 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 21:15:20.325745 master-0 kubenswrapper[7689]: I0307 21:15:20.325606 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 21:15:20.325745 master-0 kubenswrapper[7689]: I0307 21:15:20.325675 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 21:15:20.325876 master-0 kubenswrapper[7689]: I0307 21:15:20.325816 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 21:15:20.327387 master-0 kubenswrapper[7689]: I0307 21:15:20.327305 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 21:15:20.337013 master-0 kubenswrapper[7689]: I0307 21:15:20.336943 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"] Mar 07 21:15:20.396716 master-0 kubenswrapper[7689]: I0307 21:15:20.396576 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e969ada0-e795-4802-a212-aeabf75de371","Type":"ContainerStarted","Data":"e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b"} Mar 07 21:15:20.405896 master-0 kubenswrapper[7689]: I0307 21:15:20.405829 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:15:20.412529 master-0 kubenswrapper[7689]: I0307 21:15:20.412442 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-scheduler/installer-3-master-0" podStartSLOduration=2.412419178 podStartE2EDuration="2.412419178s" podCreationTimestamp="2026-03-07 21:15:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:20.411331829 +0000 UTC m=+53.963658761" watchObservedRunningTime="2026-03-07 21:15:20.412419178 +0000 UTC m=+53.964746070" Mar 07 21:15:20.427613 master-0 kubenswrapper[7689]: I0307 21:15:20.427073 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-config\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.427613 master-0 kubenswrapper[7689]: I0307 21:15:20.427256 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-client-ca\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.427613 master-0 kubenswrapper[7689]: I0307 21:15:20.427318 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9837e9f-f72d-44b7-8f75-abe00884bff6-serving-cert\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.427613 master-0 kubenswrapper[7689]: I0307 21:15:20.427355 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2cjp4\" (UniqueName: \"kubernetes.io/projected/a9837e9f-f72d-44b7-8f75-abe00884bff6-kube-api-access-2cjp4\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.453910 master-0 kubenswrapper[7689]: I0307 21:15:20.453735 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-master-0"] Mar 07 21:15:20.530712 master-0 kubenswrapper[7689]: I0307 21:15:20.529724 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-config\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.530712 master-0 kubenswrapper[7689]: I0307 21:15:20.529880 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-client-ca\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.530712 master-0 kubenswrapper[7689]: I0307 21:15:20.529915 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cjp4\" (UniqueName: \"kubernetes.io/projected/a9837e9f-f72d-44b7-8f75-abe00884bff6-kube-api-access-2cjp4\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.530712 master-0 kubenswrapper[7689]: I0307 21:15:20.529948 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9837e9f-f72d-44b7-8f75-abe00884bff6-serving-cert\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.542719 master-0 kubenswrapper[7689]: I0307 21:15:20.534443 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-config\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.542719 master-0 kubenswrapper[7689]: I0307 21:15:20.536776 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-client-ca\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.542719 master-0 kubenswrapper[7689]: I0307 21:15:20.540432 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9837e9f-f72d-44b7-8f75-abe00884bff6-serving-cert\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.568712 master-0 kubenswrapper[7689]: I0307 21:15:20.567834 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cjp4\" (UniqueName: \"kubernetes.io/projected/a9837e9f-f72d-44b7-8f75-abe00884bff6-kube-api-access-2cjp4\") pod \"route-controller-manager-7df7f5b8c-5rhtx\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") " 
pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.654748 master-0 kubenswrapper[7689]: I0307 21:15:20.651120 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:20.691009 master-0 kubenswrapper[7689]: I0307 21:15:20.690952 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7e890559-2ff3-40aa-96ef-eeb997030eb6" path="/var/lib/kubelet/pods/7e890559-2ff3-40aa-96ef-eeb997030eb6/volumes" Mar 07 21:15:21.059725 master-0 kubenswrapper[7689]: I0307 21:15:21.059629 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"] Mar 07 21:15:21.069694 master-0 kubenswrapper[7689]: W0307 21:15:21.069619 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9837e9f_f72d_44b7_8f75_abe00884bff6.slice/crio-b0389161dec8c01cd7d1f79a8e7299ae47e99a77f1756c782ea9e577c8f4185b WatchSource:0}: Error finding container b0389161dec8c01cd7d1f79a8e7299ae47e99a77f1756c782ea9e577c8f4185b: Status 404 returned error can't find the container with id b0389161dec8c01cd7d1f79a8e7299ae47e99a77f1756c782ea9e577c8f4185b Mar 07 21:15:21.404007 master-0 kubenswrapper[7689]: I0307 21:15:21.403735 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" event={"ID":"a9837e9f-f72d-44b7-8f75-abe00884bff6","Type":"ContainerStarted","Data":"24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38"} Mar 07 21:15:21.404963 master-0 kubenswrapper[7689]: I0307 21:15:21.404027 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:21.404963 master-0 kubenswrapper[7689]: I0307 
21:15:21.404048 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" event={"ID":"a9837e9f-f72d-44b7-8f75-abe00884bff6","Type":"ContainerStarted","Data":"b0389161dec8c01cd7d1f79a8e7299ae47e99a77f1756c782ea9e577c8f4185b"} Mar 07 21:15:21.413721 master-0 kubenswrapper[7689]: I0307 21:15:21.413171 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"34e734b7-82d6-493d-ace8-1945b2c08c6d","Type":"ContainerStarted","Data":"b18addaef135e00fefdd51e68b734344679afa8f4606f39797d35e107db0fa22"} Mar 07 21:15:21.413721 master-0 kubenswrapper[7689]: I0307 21:15:21.413253 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"34e734b7-82d6-493d-ace8-1945b2c08c6d","Type":"ContainerStarted","Data":"6de23860b0b81dd71d1a71f02b3b23b5ac8368494a9752dfe36eb798dc3827b1"} Mar 07 21:15:21.435390 master-0 kubenswrapper[7689]: I0307 21:15:21.435296 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" podStartSLOduration=4.435269437 podStartE2EDuration="4.435269437s" podCreationTimestamp="2026-03-07 21:15:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:21.43423516 +0000 UTC m=+54.986562092" watchObservedRunningTime="2026-03-07 21:15:21.435269437 +0000 UTC m=+54.987596339" Mar 07 21:15:21.456828 master-0 kubenswrapper[7689]: I0307 21:15:21.454814 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-master-0" podStartSLOduration=2.454794206 podStartE2EDuration="2.454794206s" podCreationTimestamp="2026-03-07 21:15:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-07 21:15:21.453183414 +0000 UTC m=+55.005510316" watchObservedRunningTime="2026-03-07 21:15:21.454794206 +0000 UTC m=+55.007121098" Mar 07 21:15:21.918635 master-0 kubenswrapper[7689]: I0307 21:15:21.918569 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:21.918907 master-0 kubenswrapper[7689]: I0307 21:15:21.918649 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: I0307 21:15:21.928103 7689 patch_prober.go:28] interesting pod/apiserver-694d775589-btnh4 container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]log ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]etcd ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/generic-apiserver-start-informers ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/max-in-flight-filter ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/project.openshift.io-projectcache ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: 
[+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/openshift.io-startinformers ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 07 21:15:21.928174 master-0 kubenswrapper[7689]: livez check failed Mar 07 21:15:21.928936 master-0 kubenswrapper[7689]: I0307 21:15:21.928188 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-694d775589-btnh4" podUID="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:15:22.001094 master-0 kubenswrapper[7689]: I0307 21:15:22.001015 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" Mar 07 21:15:26.206557 master-0 kubenswrapper[7689]: I0307 21:15:26.206470 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"] Mar 07 21:15:26.214509 master-0 kubenswrapper[7689]: I0307 21:15:26.209496 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerName="cluster-version-operator" containerID="cri-o://f11dea03780316a0cd94d2e932a489c49a45b9ec1636336c36582f2f1729ff4b" gracePeriod=130 Mar 07 21:15:26.467778 master-0 kubenswrapper[7689]: I0307 21:15:26.465383 7689 generic.go:334] "Generic (PLEG): container finished" podID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerID="f11dea03780316a0cd94d2e932a489c49a45b9ec1636336c36582f2f1729ff4b" exitCode=0 Mar 07 21:15:26.467778 master-0 kubenswrapper[7689]: I0307 21:15:26.465440 7689 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" event={"ID":"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b","Type":"ContainerDied","Data":"f11dea03780316a0cd94d2e932a489c49a45b9ec1636336c36582f2f1729ff4b"} Mar 07 21:15:26.467778 master-0 kubenswrapper[7689]: I0307 21:15:26.465475 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" event={"ID":"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b","Type":"ContainerDied","Data":"67e0f8c1f7a45675a4623829608878eb12540fff9458a8e44361ada4a21cc9a2"} Mar 07 21:15:26.467778 master-0 kubenswrapper[7689]: I0307 21:15:26.465490 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67e0f8c1f7a45675a4623829608878eb12540fff9458a8e44361ada4a21cc9a2" Mar 07 21:15:26.489196 master-0 kubenswrapper[7689]: I0307 21:15:26.489157 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:15:26.650863 master-0 kubenswrapper[7689]: I0307 21:15:26.649930 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") pod \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " Mar 07 21:15:26.651176 master-0 kubenswrapper[7689]: I0307 21:15:26.650807 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs" (OuterVolumeSpecName: "etc-ssl-certs") pod "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b"). InnerVolumeSpecName "etc-ssl-certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:26.651785 master-0 kubenswrapper[7689]: I0307 21:15:26.651464 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access\") pod \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " Mar 07 21:15:26.652474 master-0 kubenswrapper[7689]: I0307 21:15:26.652438 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca" (OuterVolumeSpecName: "service-ca") pod "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:15:26.652782 master-0 kubenswrapper[7689]: I0307 21:15:26.651574 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca\") pod \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " Mar 07 21:15:26.652998 master-0 kubenswrapper[7689]: I0307 21:15:26.652962 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") pod \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " Mar 07 21:15:26.653610 master-0 kubenswrapper[7689]: I0307 21:15:26.653456 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") pod \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\" (UID: \"3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b\") " Mar 07 21:15:26.653610 master-0 
kubenswrapper[7689]: I0307 21:15:26.653084 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads" (OuterVolumeSpecName: "etc-cvo-updatepayloads") pod "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b"). InnerVolumeSpecName "etc-cvo-updatepayloads". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:26.653936 master-0 kubenswrapper[7689]: I0307 21:15:26.653916 7689 reconciler_common.go:293] "Volume detached for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-ssl-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:26.653936 master-0 kubenswrapper[7689]: I0307 21:15:26.653932 7689 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:26.654012 master-0 kubenswrapper[7689]: I0307 21:15:26.653942 7689 reconciler_common.go:293] "Volume detached for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-etc-cvo-updatepayloads\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:26.657118 master-0 kubenswrapper[7689]: I0307 21:15:26.657060 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:15:26.658635 master-0 kubenswrapper[7689]: I0307 21:15:26.658593 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" (UID: "3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:15:26.755578 master-0 kubenswrapper[7689]: I0307 21:15:26.755505 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:26.755578 master-0 kubenswrapper[7689]: I0307 21:15:26.755546 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:26.931960 master-0 kubenswrapper[7689]: I0307 21:15:26.931894 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:26.944836 master-0 kubenswrapper[7689]: I0307 21:15:26.944757 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:15:27.470356 master-0 kubenswrapper[7689]: I0307 21:15:27.470301 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4" Mar 07 21:15:27.497767 master-0 kubenswrapper[7689]: I0307 21:15:27.497643 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"] Mar 07 21:15:27.508004 master-0 kubenswrapper[7689]: I0307 21:15:27.506311 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-version/cluster-version-operator-745944c6b7-fjbl4"] Mar 07 21:15:27.551553 master-0 kubenswrapper[7689]: I0307 21:15:27.551474 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4"] Mar 07 21:15:27.551793 master-0 kubenswrapper[7689]: E0307 21:15:27.551754 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerName="cluster-version-operator" Mar 07 21:15:27.551793 master-0 kubenswrapper[7689]: I0307 21:15:27.551774 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerName="cluster-version-operator" Mar 07 21:15:27.551937 master-0 kubenswrapper[7689]: I0307 21:15:27.551895 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerName="cluster-version-operator" Mar 07 21:15:27.553233 master-0 kubenswrapper[7689]: I0307 21:15:27.553185 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.557579 master-0 kubenswrapper[7689]: I0307 21:15:27.557527 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 07 21:15:27.558043 master-0 kubenswrapper[7689]: I0307 21:15:27.558009 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 07 21:15:27.560372 master-0 kubenswrapper[7689]: I0307 21:15:27.559747 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 07 21:15:27.671506 master-0 kubenswrapper[7689]: I0307 21:15:27.671435 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.671759 master-0 kubenswrapper[7689]: I0307 21:15:27.671662 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.671875 master-0 kubenswrapper[7689]: I0307 21:15:27.671822 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfa9d3-fc26-42e9-8bac-ff2c25223654-serving-cert\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: 
\"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.671948 master-0 kubenswrapper[7689]: I0307 21:15:27.671890 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfa9d3-fc26-42e9-8bac-ff2c25223654-service-ca\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.672506 master-0 kubenswrapper[7689]: I0307 21:15:27.672447 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96cfa9d3-fc26-42e9-8bac-ff2c25223654-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.773542 master-0 kubenswrapper[7689]: I0307 21:15:27.773360 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.773542 master-0 kubenswrapper[7689]: I0307 21:15:27.773441 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.773542 master-0 kubenswrapper[7689]: I0307 
21:15:27.773508 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfa9d3-fc26-42e9-8bac-ff2c25223654-serving-cert\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.773542 master-0 kubenswrapper[7689]: I0307 21:15:27.773533 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfa9d3-fc26-42e9-8bac-ff2c25223654-service-ca\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.773912 master-0 kubenswrapper[7689]: I0307 21:15:27.773670 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96cfa9d3-fc26-42e9-8bac-ff2c25223654-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.774168 master-0 kubenswrapper[7689]: I0307 21:15:27.774130 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.774221 master-0 kubenswrapper[7689]: I0307 21:15:27.774186 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-cvo-updatepayloads\") pod 
\"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.781797 master-0 kubenswrapper[7689]: I0307 21:15:27.779282 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfa9d3-fc26-42e9-8bac-ff2c25223654-service-ca\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.782453 master-0 kubenswrapper[7689]: I0307 21:15:27.782369 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfa9d3-fc26-42e9-8bac-ff2c25223654-serving-cert\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.802006 master-0 kubenswrapper[7689]: I0307 21:15:27.801929 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96cfa9d3-fc26-42e9-8bac-ff2c25223654-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:27.877642 master-0 kubenswrapper[7689]: I0307 21:15:27.877574 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:15:28.477434 master-0 kubenswrapper[7689]: I0307 21:15:28.477345 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" event={"ID":"96cfa9d3-fc26-42e9-8bac-ff2c25223654","Type":"ContainerStarted","Data":"a74860c7253b102381265bd05ca71aeed0e3588566e0a6daa693749f3e14d87d"} Mar 07 21:15:28.477434 master-0 kubenswrapper[7689]: I0307 21:15:28.477432 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" event={"ID":"96cfa9d3-fc26-42e9-8bac-ff2c25223654","Type":"ContainerStarted","Data":"c43f11af2c8b842060c66a8968b08b62d92a450aa814f560f58b0b7108694635"} Mar 07 21:15:28.497570 master-0 kubenswrapper[7689]: I0307 21:15:28.497447 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" podStartSLOduration=1.497417623 podStartE2EDuration="1.497417623s" podCreationTimestamp="2026-03-07 21:15:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:28.494537208 +0000 UTC m=+62.046864110" watchObservedRunningTime="2026-03-07 21:15:28.497417623 +0000 UTC m=+62.049744525" Mar 07 21:15:28.694339 master-0 kubenswrapper[7689]: I0307 21:15:28.694266 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" path="/var/lib/kubelet/pods/3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b/volumes" Mar 07 21:15:29.455729 master-0 kubenswrapper[7689]: I0307 21:15:29.455603 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_2b95a709-faec-4d50-8742-935bddd84cbc/installer/0.log" Mar 07 21:15:29.455729 master-0 kubenswrapper[7689]: I0307 21:15:29.455675 7689 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 07 21:15:29.483413 master-0 kubenswrapper[7689]: I0307 21:15:29.483350 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-1-master-0_2b95a709-faec-4d50-8742-935bddd84cbc/installer/0.log" Mar 07 21:15:29.484146 master-0 kubenswrapper[7689]: I0307 21:15:29.483410 7689 generic.go:334] "Generic (PLEG): container finished" podID="2b95a709-faec-4d50-8742-935bddd84cbc" containerID="bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476" exitCode=1 Mar 07 21:15:29.484146 master-0 kubenswrapper[7689]: I0307 21:15:29.483455 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-1-master-0" Mar 07 21:15:29.484146 master-0 kubenswrapper[7689]: I0307 21:15:29.483449 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"2b95a709-faec-4d50-8742-935bddd84cbc","Type":"ContainerDied","Data":"bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476"} Mar 07 21:15:29.484146 master-0 kubenswrapper[7689]: I0307 21:15:29.483597 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-1-master-0" event={"ID":"2b95a709-faec-4d50-8742-935bddd84cbc","Type":"ContainerDied","Data":"fdbfce6137a81a2ff42bfd72e38c7518b24c4c85ede18658ee46b97ef6f69012"} Mar 07 21:15:29.484146 master-0 kubenswrapper[7689]: I0307 21:15:29.483621 7689 scope.go:117] "RemoveContainer" containerID="bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476" Mar 07 21:15:29.499080 master-0 kubenswrapper[7689]: I0307 21:15:29.499021 7689 scope.go:117] "RemoveContainer" containerID="bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476" Mar 07 21:15:29.499420 master-0 kubenswrapper[7689]: E0307 21:15:29.499363 7689 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476\": container with ID starting with bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476 not found: ID does not exist" containerID="bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476" Mar 07 21:15:29.499420 master-0 kubenswrapper[7689]: I0307 21:15:29.499402 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476"} err="failed to get container status \"bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476\": rpc error: code = NotFound desc = could not find container \"bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476\": container with ID starting with bdceb353baeb003ee5739a050667909b334f7b1aa3f9c12a52cedb23917da476 not found: ID does not exist" Mar 07 21:15:29.567322 master-0 kubenswrapper[7689]: I0307 21:15:29.567250 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 07 21:15:29.567631 master-0 kubenswrapper[7689]: I0307 21:15:29.567500 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/installer-3-master-0" podUID="e969ada0-e795-4802-a212-aeabf75de371" containerName="installer" containerID="cri-o://e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b" gracePeriod=30 Mar 07 21:15:29.604604 master-0 kubenswrapper[7689]: I0307 21:15:29.604483 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-kubelet-dir\") pod \"2b95a709-faec-4d50-8742-935bddd84cbc\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " Mar 07 21:15:29.604980 master-0 kubenswrapper[7689]: I0307 21:15:29.604640 7689 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-var-lock\") pod \"2b95a709-faec-4d50-8742-935bddd84cbc\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " Mar 07 21:15:29.604980 master-0 kubenswrapper[7689]: I0307 21:15:29.604673 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b95a709-faec-4d50-8742-935bddd84cbc-kube-api-access\") pod \"2b95a709-faec-4d50-8742-935bddd84cbc\" (UID: \"2b95a709-faec-4d50-8742-935bddd84cbc\") " Mar 07 21:15:29.604980 master-0 kubenswrapper[7689]: I0307 21:15:29.604803 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2b95a709-faec-4d50-8742-935bddd84cbc" (UID: "2b95a709-faec-4d50-8742-935bddd84cbc"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:29.604980 master-0 kubenswrapper[7689]: I0307 21:15:29.604894 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-var-lock" (OuterVolumeSpecName: "var-lock") pod "2b95a709-faec-4d50-8742-935bddd84cbc" (UID: "2b95a709-faec-4d50-8742-935bddd84cbc"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:29.605277 master-0 kubenswrapper[7689]: I0307 21:15:29.605241 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:29.605403 master-0 kubenswrapper[7689]: I0307 21:15:29.605367 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2b95a709-faec-4d50-8742-935bddd84cbc-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:29.609109 master-0 kubenswrapper[7689]: I0307 21:15:29.609061 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b95a709-faec-4d50-8742-935bddd84cbc-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2b95a709-faec-4d50-8742-935bddd84cbc" (UID: "2b95a709-faec-4d50-8742-935bddd84cbc"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:15:29.707418 master-0 kubenswrapper[7689]: I0307 21:15:29.707324 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2b95a709-faec-4d50-8742-935bddd84cbc-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:30.261670 master-0 kubenswrapper[7689]: I0307 21:15:30.261464 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_e969ada0-e795-4802-a212-aeabf75de371/installer/0.log" Mar 07 21:15:30.261670 master-0 kubenswrapper[7689]: I0307 21:15:30.261539 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 07 21:15:30.316877 master-0 kubenswrapper[7689]: I0307 21:15:30.316273 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-var-lock\") pod \"e969ada0-e795-4802-a212-aeabf75de371\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " Mar 07 21:15:30.316877 master-0 kubenswrapper[7689]: I0307 21:15:30.316359 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e969ada0-e795-4802-a212-aeabf75de371-kube-api-access\") pod \"e969ada0-e795-4802-a212-aeabf75de371\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " Mar 07 21:15:30.316877 master-0 kubenswrapper[7689]: I0307 21:15:30.316400 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-kubelet-dir\") pod \"e969ada0-e795-4802-a212-aeabf75de371\" (UID: \"e969ada0-e795-4802-a212-aeabf75de371\") " Mar 07 21:15:30.316877 master-0 kubenswrapper[7689]: I0307 21:15:30.316423 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-var-lock" (OuterVolumeSpecName: "var-lock") pod "e969ada0-e795-4802-a212-aeabf75de371" (UID: "e969ada0-e795-4802-a212-aeabf75de371"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:30.316877 master-0 kubenswrapper[7689]: I0307 21:15:30.316606 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e969ada0-e795-4802-a212-aeabf75de371" (UID: "e969ada0-e795-4802-a212-aeabf75de371"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:30.316877 master-0 kubenswrapper[7689]: I0307 21:15:30.316809 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:30.316877 master-0 kubenswrapper[7689]: I0307 21:15:30.316835 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e969ada0-e795-4802-a212-aeabf75de371-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:30.322085 master-0 kubenswrapper[7689]: I0307 21:15:30.322025 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e969ada0-e795-4802-a212-aeabf75de371-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e969ada0-e795-4802-a212-aeabf75de371" (UID: "e969ada0-e795-4802-a212-aeabf75de371"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:15:30.418954 master-0 kubenswrapper[7689]: I0307 21:15:30.418841 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e969ada0-e795-4802-a212-aeabf75de371-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:30.493942 master-0 kubenswrapper[7689]: I0307 21:15:30.493545 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-3-master-0_e969ada0-e795-4802-a212-aeabf75de371/installer/0.log" Mar 07 21:15:30.493942 master-0 kubenswrapper[7689]: I0307 21:15:30.493657 7689 generic.go:334] "Generic (PLEG): container finished" podID="e969ada0-e795-4802-a212-aeabf75de371" containerID="e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b" exitCode=1 Mar 07 21:15:30.493942 master-0 kubenswrapper[7689]: I0307 21:15:30.493773 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-3-master-0" Mar 07 21:15:30.493942 master-0 kubenswrapper[7689]: I0307 21:15:30.493760 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e969ada0-e795-4802-a212-aeabf75de371","Type":"ContainerDied","Data":"e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b"} Mar 07 21:15:30.493942 master-0 kubenswrapper[7689]: I0307 21:15:30.493855 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-3-master-0" event={"ID":"e969ada0-e795-4802-a212-aeabf75de371","Type":"ContainerDied","Data":"74a7e4f4531cb3594a45acf0ff68e982eea538a5e2a9de26ff5bfc8c54f201b3"} Mar 07 21:15:30.493942 master-0 kubenswrapper[7689]: I0307 21:15:30.493894 7689 scope.go:117] "RemoveContainer" containerID="e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b" Mar 07 21:15:30.513323 master-0 kubenswrapper[7689]: I0307 21:15:30.513274 7689 scope.go:117] "RemoveContainer" containerID="e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b" Mar 07 21:15:30.514079 master-0 kubenswrapper[7689]: E0307 21:15:30.513988 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b\": container with ID starting with e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b not found: ID does not exist" containerID="e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b" Mar 07 21:15:30.514190 master-0 kubenswrapper[7689]: I0307 21:15:30.514096 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b"} err="failed to get container status \"e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b\": rpc error: code = NotFound desc = could not find 
container \"e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b\": container with ID starting with e56015f2610d9a74fd0dfd22d81c3c1cae2135bf07a1bd49c383eb5ce457062b not found: ID does not exist" Mar 07 21:15:30.587117 master-0 kubenswrapper[7689]: I0307 21:15:30.587018 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 07 21:15:30.589174 master-0 kubenswrapper[7689]: I0307 21:15:30.589124 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hm77f" Mar 07 21:15:30.919984 master-0 kubenswrapper[7689]: I0307 21:15:30.919877 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"] Mar 07 21:15:30.920329 master-0 kubenswrapper[7689]: I0307 21:15:30.920234 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/installer-1-master-0" podUID="576e332a-c381-4582-bb5e-02d32bb376a4" containerName="installer" containerID="cri-o://03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c" gracePeriod=30 Mar 07 21:15:31.005354 master-0 kubenswrapper[7689]: I0307 21:15:31.005289 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-1-master-0"] Mar 07 21:15:32.409912 master-0 kubenswrapper[7689]: I0307 21:15:32.409795 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 07 21:15:32.410793 master-0 kubenswrapper[7689]: E0307 21:15:32.410288 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b95a709-faec-4d50-8742-935bddd84cbc" containerName="installer" Mar 07 21:15:32.410793 master-0 kubenswrapper[7689]: I0307 21:15:32.410326 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b95a709-faec-4d50-8742-935bddd84cbc" containerName="installer" Mar 07 21:15:32.410793 master-0 kubenswrapper[7689]: E0307 
21:15:32.410353 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e969ada0-e795-4802-a212-aeabf75de371" containerName="installer" Mar 07 21:15:32.410793 master-0 kubenswrapper[7689]: I0307 21:15:32.410373 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e969ada0-e795-4802-a212-aeabf75de371" containerName="installer" Mar 07 21:15:32.410793 master-0 kubenswrapper[7689]: I0307 21:15:32.410637 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b95a709-faec-4d50-8742-935bddd84cbc" containerName="installer" Mar 07 21:15:32.410793 master-0 kubenswrapper[7689]: I0307 21:15:32.410674 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e969ada0-e795-4802-a212-aeabf75de371" containerName="installer" Mar 07 21:15:32.411583 master-0 kubenswrapper[7689]: I0307 21:15:32.411531 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.413498 master-0 kubenswrapper[7689]: I0307 21:15:32.413406 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 07 21:15:32.414891 master-0 kubenswrapper[7689]: I0307 21:15:32.414846 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 07 21:15:32.415509 master-0 kubenswrapper[7689]: I0307 21:15:32.415465 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-8cblb" Mar 07 21:15:32.563653 master-0 kubenswrapper[7689]: I0307 21:15:32.563560 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.563653 master-0 
kubenswrapper[7689]: I0307 21:15:32.563637 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-var-lock\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.564021 master-0 kubenswrapper[7689]: I0307 21:15:32.563727 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddc814a4-b865-4a35-b5f8-f54af449fe25-kube-api-access\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.609762 master-0 kubenswrapper[7689]: I0307 21:15:32.609652 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 07 21:15:32.617254 master-0 kubenswrapper[7689]: I0307 21:15:32.617192 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/installer-3-master-0"] Mar 07 21:15:32.669153 master-0 kubenswrapper[7689]: I0307 21:15:32.668517 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddc814a4-b865-4a35-b5f8-f54af449fe25-kube-api-access\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.669153 master-0 kubenswrapper[7689]: I0307 21:15:32.668728 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.669153 master-0 
kubenswrapper[7689]: I0307 21:15:32.668763 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-var-lock\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.669153 master-0 kubenswrapper[7689]: I0307 21:15:32.668868 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-var-lock\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.669510 master-0 kubenswrapper[7689]: I0307 21:15:32.669301 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.689874 master-0 kubenswrapper[7689]: I0307 21:15:32.689740 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b95a709-faec-4d50-8742-935bddd84cbc" path="/var/lib/kubelet/pods/2b95a709-faec-4d50-8742-935bddd84cbc/volumes" Mar 07 21:15:32.690486 master-0 kubenswrapper[7689]: I0307 21:15:32.690439 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e969ada0-e795-4802-a212-aeabf75de371" path="/var/lib/kubelet/pods/e969ada0-e795-4802-a212-aeabf75de371/volumes" Mar 07 21:15:32.833568 master-0 kubenswrapper[7689]: I0307 21:15:32.833499 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddc814a4-b865-4a35-b5f8-f54af449fe25-kube-api-access\") pod \"installer-4-master-0\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " 
pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:32.973055 master-0 kubenswrapper[7689]: I0307 21:15:32.972886 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:15:32.973055 master-0 kubenswrapper[7689]: I0307 21:15:32.972971 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:15:32.973395 master-0 kubenswrapper[7689]: I0307 21:15:32.973268 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:15:32.974205 master-0 kubenswrapper[7689]: I0307 21:15:32.974160 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:15:32.974465 master-0 kubenswrapper[7689]: I0307 21:15:32.974419 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:15:32.974565 master-0 kubenswrapper[7689]: I0307 21:15:32.974508 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:15:32.975021 master-0 kubenswrapper[7689]: I0307 21:15:32.974974 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:15:32.978632 master-0 kubenswrapper[7689]: I0307 21:15:32.978574 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:15:32.978779 master-0 kubenswrapper[7689]: I0307 21:15:32.978732 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:15:32.980077 master-0 
kubenswrapper[7689]: I0307 21:15:32.980027 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:15:32.980164 master-0 kubenswrapper[7689]: I0307 21:15:32.980113 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:15:32.981003 master-0 kubenswrapper[7689]: I0307 21:15:32.980949 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:15:32.982167 master-0 kubenswrapper[7689]: I0307 21:15:32.982099 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:15:32.983274 master-0 kubenswrapper[7689]: I0307 21:15:32.983193 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:15:33.045066 master-0 kubenswrapper[7689]: I0307 21:15:33.044824 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:15:33.206872 master-0 kubenswrapper[7689]: I0307 21:15:33.206791 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:15:33.207245 master-0 kubenswrapper[7689]: I0307 21:15:33.206866 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:15:33.207245 master-0 kubenswrapper[7689]: I0307 21:15:33.207143 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:15:33.207445 master-0 kubenswrapper[7689]: I0307 21:15:33.206889 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:15:33.207445 master-0 kubenswrapper[7689]: I0307 21:15:33.207413 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:15:33.214964 master-0 kubenswrapper[7689]: I0307 21:15:33.214750 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:15:33.215174 master-0 kubenswrapper[7689]: I0307 21:15:33.215046 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:15:33.555633 master-0 kubenswrapper[7689]: I0307 21:15:33.555585 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-master-0"] Mar 07 21:15:33.714043 master-0 kubenswrapper[7689]: I0307 21:15:33.713973 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"] Mar 07 21:15:33.724978 master-0 kubenswrapper[7689]: W0307 21:15:33.724914 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc392945_53ad_473c_8803_70e2026712d2.slice/crio-27a66b1a1e6596ddb9cb3d1cc895f4b835b320b66695a73366512a5d14007017 WatchSource:0}: Error finding container 27a66b1a1e6596ddb9cb3d1cc895f4b835b320b66695a73366512a5d14007017: Status 404 returned error can't find the container with id 27a66b1a1e6596ddb9cb3d1cc895f4b835b320b66695a73366512a5d14007017 Mar 07 21:15:33.814770 master-0 kubenswrapper[7689]: I0307 21:15:33.814651 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"] Mar 07 21:15:33.823795 master-0 kubenswrapper[7689]: I0307 21:15:33.823159 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"] Mar 07 21:15:33.835166 master-0 kubenswrapper[7689]: I0307 21:15:33.835128 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 07 21:15:33.837274 master-0 kubenswrapper[7689]: I0307 21:15:33.835914 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:33.846982 master-0 kubenswrapper[7689]: I0307 21:15:33.845281 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-xlnsg" Mar 07 21:15:33.846982 master-0 kubenswrapper[7689]: W0307 21:15:33.846969 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69851821_e1fc_44a8_98df_0cfe9d564126.slice/crio-34784b03a50881cb9335e39c23d5c919024887bcefedfbd739f247046659eb16 WatchSource:0}: Error finding container 34784b03a50881cb9335e39c23d5c919024887bcefedfbd739f247046659eb16: Status 404 returned error can't find the container with id 34784b03a50881cb9335e39c23d5c919024887bcefedfbd739f247046659eb16 Mar 07 21:15:33.857181 master-0 kubenswrapper[7689]: I0307 21:15:33.857106 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 07 21:15:33.863363 master-0 kubenswrapper[7689]: I0307 21:15:33.863320 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"] Mar 07 21:15:33.896254 master-0 kubenswrapper[7689]: I0307 21:15:33.896046 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-mmqbs"] Mar 07 21:15:33.897198 master-0 kubenswrapper[7689]: I0307 21:15:33.897150 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kube-api-access\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:33.897349 master-0 kubenswrapper[7689]: I0307 21:15:33.897316 7689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-var-lock\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:33.897451 master-0 kubenswrapper[7689]: I0307 21:15:33.897423 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:33.906162 master-0 kubenswrapper[7689]: W0307 21:15:33.906101 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982319eb_2dc2_4faa_85d8_ee11840179fd.slice/crio-54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25 WatchSource:0}: Error finding container 54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25: Status 404 returned error can't find the container with id 54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25 Mar 07 21:15:33.968946 master-0 kubenswrapper[7689]: I0307 21:15:33.968884 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"] Mar 07 21:15:33.976218 master-0 kubenswrapper[7689]: I0307 21:15:33.976181 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-l2bdp"] Mar 07 21:15:33.977022 master-0 kubenswrapper[7689]: W0307 21:15:33.976946 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7b89e6e3_1fe4_4ada_a5ca_0d7b2ae16149.slice/crio-6fd04d07fa9a3fb32805e8c1045c690ba26c7813c9c939367cc05dfe1bd099ee 
WatchSource:0}: Error finding container 6fd04d07fa9a3fb32805e8c1045c690ba26c7813c9c939367cc05dfe1bd099ee: Status 404 returned error can't find the container with id 6fd04d07fa9a3fb32805e8c1045c690ba26c7813c9c939367cc05dfe1bd099ee Mar 07 21:15:33.993428 master-0 kubenswrapper[7689]: W0307 21:15:33.993373 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd310b71_6c79_4169_8b8a_7b3fe35a97fd.slice/crio-59e2a2e1903d65790e4e245427eff740f03e2eb639cf0fb3b61443389c473dd4 WatchSource:0}: Error finding container 59e2a2e1903d65790e4e245427eff740f03e2eb639cf0fb3b61443389c473dd4: Status 404 returned error can't find the container with id 59e2a2e1903d65790e4e245427eff740f03e2eb639cf0fb3b61443389c473dd4 Mar 07 21:15:33.998766 master-0 kubenswrapper[7689]: I0307 21:15:33.998718 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kube-api-access\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:33.998832 master-0 kubenswrapper[7689]: I0307 21:15:33.998811 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-var-lock\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:33.998895 master-0 kubenswrapper[7689]: I0307 21:15:33.998873 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 
21:15:33.998954 master-0 kubenswrapper[7689]: I0307 21:15:33.998923 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-var-lock\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:33.999071 master-0 kubenswrapper[7689]: I0307 21:15:33.999044 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:34.017846 master-0 kubenswrapper[7689]: I0307 21:15:34.017805 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kube-api-access\") pod \"installer-2-master-0\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") " pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:34.047596 master-0 kubenswrapper[7689]: I0307 21:15:34.047551 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:15:34.471707 master-0 kubenswrapper[7689]: I0307 21:15:34.471625 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-2-master-0"] Mar 07 21:15:34.480988 master-0 kubenswrapper[7689]: W0307 21:15:34.480927 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podbc5c4a14_0fdc_4c09_abda_7a2277a20c54.slice/crio-ae393ba35dc3e8c28db54e63f526ea5a6d375dafe0c9fef22965081dde677e6d WatchSource:0}: Error finding container ae393ba35dc3e8c28db54e63f526ea5a6d375dafe0c9fef22965081dde677e6d: Status 404 returned error can't find the container with id ae393ba35dc3e8c28db54e63f526ea5a6d375dafe0c9fef22965081dde677e6d Mar 07 21:15:34.521646 master-0 kubenswrapper[7689]: I0307 21:15:34.520160 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" event={"ID":"a9d64cd1-bd5b-4fbc-972b-000a03c854fe","Type":"ContainerStarted","Data":"5731d94226d26524a88cd0e1f020f55306937afa54c19184462a51a135d32f71"} Mar 07 21:15:34.522232 master-0 kubenswrapper[7689]: I0307 21:15:34.522171 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"bc5c4a14-0fdc-4c09-abda-7a2277a20c54","Type":"ContainerStarted","Data":"ae393ba35dc3e8c28db54e63f526ea5a6d375dafe0c9fef22965081dde677e6d"} Mar 07 21:15:34.523275 master-0 kubenswrapper[7689]: I0307 21:15:34.523165 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" event={"ID":"69851821-e1fc-44a8-98df-0cfe9d564126","Type":"ContainerStarted","Data":"34784b03a50881cb9335e39c23d5c919024887bcefedfbd739f247046659eb16"} Mar 07 21:15:34.524445 master-0 kubenswrapper[7689]: I0307 21:15:34.524247 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" event={"ID":"982319eb-2dc2-4faa-85d8-ee11840179fd","Type":"ContainerStarted","Data":"54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25"} Mar 07 21:15:34.525571 master-0 kubenswrapper[7689]: I0307 21:15:34.525525 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l2bdp" event={"ID":"dd310b71-6c79-4169-8b8a-7b3fe35a97fd","Type":"ContainerStarted","Data":"59e2a2e1903d65790e4e245427eff740f03e2eb639cf0fb3b61443389c473dd4"} Mar 07 21:15:34.529027 master-0 kubenswrapper[7689]: I0307 21:15:34.528877 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" event={"ID":"e720291b-0f96-4ebb-80f2-5df7cb194ffc","Type":"ContainerStarted","Data":"830157826e05e2de3fde03b0efe550c5f62b4477f3c1f398d0aceaab2ecf408c"} Mar 07 21:15:34.529126 master-0 kubenswrapper[7689]: I0307 21:15:34.529027 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" event={"ID":"e720291b-0f96-4ebb-80f2-5df7cb194ffc","Type":"ContainerStarted","Data":"88596b62ed73d1cc0a657006e38bdd5646ef2e8ca1da1e67945f77115c8e4249"} Mar 07 21:15:34.530778 master-0 kubenswrapper[7689]: I0307 21:15:34.530726 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"ddc814a4-b865-4a35-b5f8-f54af449fe25","Type":"ContainerStarted","Data":"f84fa34d05ad67aec62ca362c7866be59185619297d89ccd25b8d12c9a739a50"} Mar 07 21:15:34.530907 master-0 kubenswrapper[7689]: I0307 21:15:34.530790 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"ddc814a4-b865-4a35-b5f8-f54af449fe25","Type":"ContainerStarted","Data":"463d8bfc31fe475b18975fa1110d938e01959c570bcc75066d9a8d30bafab290"} Mar 07 21:15:34.532733 master-0 
kubenswrapper[7689]: I0307 21:15:34.532702 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" event={"ID":"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149","Type":"ContainerStarted","Data":"6fd04d07fa9a3fb32805e8c1045c690ba26c7813c9c939367cc05dfe1bd099ee"} Mar 07 21:15:34.535078 master-0 kubenswrapper[7689]: I0307 21:15:34.535049 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" event={"ID":"fc392945-53ad-473c-8803-70e2026712d2","Type":"ContainerStarted","Data":"27a66b1a1e6596ddb9cb3d1cc895f4b835b320b66695a73366512a5d14007017"} Mar 07 21:15:34.553477 master-0 kubenswrapper[7689]: I0307 21:15:34.553355 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-master-0" podStartSLOduration=3.553315442 podStartE2EDuration="3.553315442s" podCreationTimestamp="2026-03-07 21:15:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:34.55094474 +0000 UTC m=+68.103271672" watchObservedRunningTime="2026-03-07 21:15:34.553315442 +0000 UTC m=+68.105642324" Mar 07 21:15:35.544160 master-0 kubenswrapper[7689]: I0307 21:15:35.544087 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"bc5c4a14-0fdc-4c09-abda-7a2277a20c54","Type":"ContainerStarted","Data":"f731d58484b6e995b134d609352f74f3a18338de0be2a0cddb04f00bff760ac6"} Mar 07 21:15:36.719134 master-0 kubenswrapper[7689]: I0307 21:15:36.719034 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-2-master-0" podStartSLOduration=3.719012878 podStartE2EDuration="3.719012878s" podCreationTimestamp="2026-03-07 21:15:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:35.562216097 +0000 UTC m=+69.114542999" watchObservedRunningTime="2026-03-07 21:15:36.719012878 +0000 UTC m=+70.271339770"
Mar 07 21:15:36.863292 master-0 kubenswrapper[7689]: I0307 21:15:36.861865 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"]
Mar 07 21:15:36.863292 master-0 kubenswrapper[7689]: I0307 21:15:36.862327 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" podUID="83377bd5-67a6-4108-b3ac-a3d338813fc1" containerName="controller-manager" containerID="cri-o://cf850f29efe8ab776668f6d57f50250ae977e92fba1ea0538e7ff3a1dfc8b3ac" gracePeriod=30
Mar 07 21:15:36.877558 master-0 kubenswrapper[7689]: I0307 21:15:36.877250 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"]
Mar 07 21:15:36.878163 master-0 kubenswrapper[7689]: I0307 21:15:36.878113 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" podUID="a9837e9f-f72d-44b7-8f75-abe00884bff6" containerName="route-controller-manager" containerID="cri-o://24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38" gracePeriod=30
Mar 07 21:15:37.550530 master-0 kubenswrapper[7689]: I0307 21:15:37.550476 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"
Mar 07 21:15:37.565289 master-0 kubenswrapper[7689]: I0307 21:15:37.565249 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" event={"ID":"fc392945-53ad-473c-8803-70e2026712d2","Type":"ContainerStarted","Data":"d4b7300644150fe23cfc59508105971a56a432a4d87f592adbcc874823ecb22d"}
Mar 07 21:15:37.565580 master-0 kubenswrapper[7689]: I0307 21:15:37.565564 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:15:37.568580 master-0 kubenswrapper[7689]: I0307 21:15:37.568559 7689 patch_prober.go:28] interesting pod/marketplace-operator-64bf9778cb-q7hrg container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.128.0.8:8080/healthz\": dial tcp 10.128.0.8:8080: connect: connection refused" start-of-body=
Mar 07 21:15:37.568761 master-0 kubenswrapper[7689]: I0307 21:15:37.568731 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" podUID="fc392945-53ad-473c-8803-70e2026712d2" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.128.0.8:8080/healthz\": dial tcp 10.128.0.8:8080: connect: connection refused"
Mar 07 21:15:37.569601 master-0 kubenswrapper[7689]: I0307 21:15:37.569580 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" event={"ID":"a9d64cd1-bd5b-4fbc-972b-000a03c854fe","Type":"ContainerStarted","Data":"02176d71a2dda9e217cf731139b14e2c510faae6fe6f033712592455c5329ddf"}
Mar 07 21:15:37.586562 master-0 kubenswrapper[7689]: I0307 21:15:37.586388 7689 generic.go:334] "Generic (PLEG): container finished" podID="83377bd5-67a6-4108-b3ac-a3d338813fc1" containerID="cf850f29efe8ab776668f6d57f50250ae977e92fba1ea0538e7ff3a1dfc8b3ac" exitCode=0
Mar 07 21:15:37.587087 master-0 kubenswrapper[7689]: I0307 21:15:37.587029 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" event={"ID":"83377bd5-67a6-4108-b3ac-a3d338813fc1","Type":"ContainerDied","Data":"cf850f29efe8ab776668f6d57f50250ae977e92fba1ea0538e7ff3a1dfc8b3ac"}
Mar 07 21:15:37.629357 master-0 kubenswrapper[7689]: I0307 21:15:37.629293 7689 generic.go:334] "Generic (PLEG): container finished" podID="a9837e9f-f72d-44b7-8f75-abe00884bff6" containerID="24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38" exitCode=0
Mar 07 21:15:37.629357 master-0 kubenswrapper[7689]: I0307 21:15:37.629354 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" event={"ID":"a9837e9f-f72d-44b7-8f75-abe00884bff6","Type":"ContainerDied","Data":"24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38"}
Mar 07 21:15:37.629551 master-0 kubenswrapper[7689]: I0307 21:15:37.629359 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"
Mar 07 21:15:37.629551 master-0 kubenswrapper[7689]: I0307 21:15:37.629385 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx" event={"ID":"a9837e9f-f72d-44b7-8f75-abe00884bff6","Type":"ContainerDied","Data":"b0389161dec8c01cd7d1f79a8e7299ae47e99a77f1756c782ea9e577c8f4185b"}
Mar 07 21:15:37.629551 master-0 kubenswrapper[7689]: I0307 21:15:37.629407 7689 scope.go:117] "RemoveContainer" containerID="24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38"
Mar 07 21:15:37.651255 master-0 kubenswrapper[7689]: I0307 21:15:37.651220 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:37.655863 master-0 kubenswrapper[7689]: I0307 21:15:37.654592 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-client-ca\") pod \"a9837e9f-f72d-44b7-8f75-abe00884bff6\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") "
Mar 07 21:15:37.655863 master-0 kubenswrapper[7689]: I0307 21:15:37.654634 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9837e9f-f72d-44b7-8f75-abe00884bff6-serving-cert\") pod \"a9837e9f-f72d-44b7-8f75-abe00884bff6\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") "
Mar 07 21:15:37.655863 master-0 kubenswrapper[7689]: I0307 21:15:37.654704 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cjp4\" (UniqueName: \"kubernetes.io/projected/a9837e9f-f72d-44b7-8f75-abe00884bff6-kube-api-access-2cjp4\") pod \"a9837e9f-f72d-44b7-8f75-abe00884bff6\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") "
Mar 07 21:15:37.655863 master-0 kubenswrapper[7689]: I0307 21:15:37.654755 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-config\") pod \"a9837e9f-f72d-44b7-8f75-abe00884bff6\" (UID: \"a9837e9f-f72d-44b7-8f75-abe00884bff6\") "
Mar 07 21:15:37.658026 master-0 kubenswrapper[7689]: I0307 21:15:37.657998 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-client-ca" (OuterVolumeSpecName: "client-ca") pod "a9837e9f-f72d-44b7-8f75-abe00884bff6" (UID: "a9837e9f-f72d-44b7-8f75-abe00884bff6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:37.659506 master-0 kubenswrapper[7689]: I0307 21:15:37.659312 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-config" (OuterVolumeSpecName: "config") pod "a9837e9f-f72d-44b7-8f75-abe00884bff6" (UID: "a9837e9f-f72d-44b7-8f75-abe00884bff6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:37.662917 master-0 kubenswrapper[7689]: I0307 21:15:37.661665 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9837e9f-f72d-44b7-8f75-abe00884bff6-kube-api-access-2cjp4" (OuterVolumeSpecName: "kube-api-access-2cjp4") pod "a9837e9f-f72d-44b7-8f75-abe00884bff6" (UID: "a9837e9f-f72d-44b7-8f75-abe00884bff6"). InnerVolumeSpecName "kube-api-access-2cjp4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:15:37.670737 master-0 kubenswrapper[7689]: I0307 21:15:37.670694 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a9837e9f-f72d-44b7-8f75-abe00884bff6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a9837e9f-f72d-44b7-8f75-abe00884bff6" (UID: "a9837e9f-f72d-44b7-8f75-abe00884bff6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:15:37.713151 master-0 kubenswrapper[7689]: I0307 21:15:37.712258 7689 scope.go:117] "RemoveContainer" containerID="24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38"
Mar 07 21:15:37.713151 master-0 kubenswrapper[7689]: E0307 21:15:37.712662 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38\": container with ID starting with 24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38 not found: ID does not exist" containerID="24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38"
Mar 07 21:15:37.713151 master-0 kubenswrapper[7689]: I0307 21:15:37.712789 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38"} err="failed to get container status \"24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38\": rpc error: code = NotFound desc = could not find container \"24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38\": container with ID starting with 24bf442b8fd50847668a920049128f1bd4f45e50d27004a73a889b4468915f38 not found: ID does not exist"
Mar 07 21:15:37.756498 master-0 kubenswrapper[7689]: I0307 21:15:37.756444 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-config\") pod \"83377bd5-67a6-4108-b3ac-a3d338813fc1\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") "
Mar 07 21:15:37.756953 master-0 kubenswrapper[7689]: I0307 21:15:37.756614 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-client-ca\") pod \"83377bd5-67a6-4108-b3ac-a3d338813fc1\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") "
Mar 07 21:15:37.756953 master-0 kubenswrapper[7689]: I0307 21:15:37.756675 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-proxy-ca-bundles\") pod \"83377bd5-67a6-4108-b3ac-a3d338813fc1\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") "
Mar 07 21:15:37.756953 master-0 kubenswrapper[7689]: I0307 21:15:37.756737 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-982pp\" (UniqueName: \"kubernetes.io/projected/83377bd5-67a6-4108-b3ac-a3d338813fc1-kube-api-access-982pp\") pod \"83377bd5-67a6-4108-b3ac-a3d338813fc1\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") "
Mar 07 21:15:37.756953 master-0 kubenswrapper[7689]: I0307 21:15:37.756780 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83377bd5-67a6-4108-b3ac-a3d338813fc1-serving-cert\") pod \"83377bd5-67a6-4108-b3ac-a3d338813fc1\" (UID: \"83377bd5-67a6-4108-b3ac-a3d338813fc1\") "
Mar 07 21:15:37.757103 master-0 kubenswrapper[7689]: I0307 21:15:37.757052 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-client-ca" (OuterVolumeSpecName: "client-ca") pod "83377bd5-67a6-4108-b3ac-a3d338813fc1" (UID: "83377bd5-67a6-4108-b3ac-a3d338813fc1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:37.757537 master-0 kubenswrapper[7689]: I0307 21:15:37.757507 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cjp4\" (UniqueName: \"kubernetes.io/projected/a9837e9f-f72d-44b7-8f75-abe00884bff6-kube-api-access-2cjp4\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.760105 master-0 kubenswrapper[7689]: I0307 21:15:37.757915 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-config" (OuterVolumeSpecName: "config") pod "83377bd5-67a6-4108-b3ac-a3d338813fc1" (UID: "83377bd5-67a6-4108-b3ac-a3d338813fc1"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:37.760105 master-0 kubenswrapper[7689]: I0307 21:15:37.757959 7689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.760105 master-0 kubenswrapper[7689]: I0307 21:15:37.757973 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.760105 master-0 kubenswrapper[7689]: I0307 21:15:37.757982 7689 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a9837e9f-f72d-44b7-8f75-abe00884bff6-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.760105 master-0 kubenswrapper[7689]: I0307 21:15:37.757991 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a9837e9f-f72d-44b7-8f75-abe00884bff6-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.760105 master-0 kubenswrapper[7689]: I0307 21:15:37.758025 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "83377bd5-67a6-4108-b3ac-a3d338813fc1" (UID: "83377bd5-67a6-4108-b3ac-a3d338813fc1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:15:37.770918 master-0 kubenswrapper[7689]: E0307 21:15:37.770423 7689 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b339e6a_cae6_416a_963b_2fd23cecba96.slice/crio-conmon-4ce1bc8e249944d7cde9f138282cf087a8521cf190e44f0f1b32f20172ea8a91.scope\": RecentStats: unable to find data in memory cache]"
Mar 07 21:15:37.794125 master-0 kubenswrapper[7689]: I0307 21:15:37.793945 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83377bd5-67a6-4108-b3ac-a3d338813fc1-kube-api-access-982pp" (OuterVolumeSpecName: "kube-api-access-982pp") pod "83377bd5-67a6-4108-b3ac-a3d338813fc1" (UID: "83377bd5-67a6-4108-b3ac-a3d338813fc1"). InnerVolumeSpecName "kube-api-access-982pp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:15:37.794939 master-0 kubenswrapper[7689]: I0307 21:15:37.794859 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83377bd5-67a6-4108-b3ac-a3d338813fc1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "83377bd5-67a6-4108-b3ac-a3d338813fc1" (UID: "83377bd5-67a6-4108-b3ac-a3d338813fc1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:15:37.859435 master-0 kubenswrapper[7689]: I0307 21:15:37.859375 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-982pp\" (UniqueName: \"kubernetes.io/projected/83377bd5-67a6-4108-b3ac-a3d338813fc1-kube-api-access-982pp\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.859435 master-0 kubenswrapper[7689]: I0307 21:15:37.859437 7689 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.859630 master-0 kubenswrapper[7689]: I0307 21:15:37.859457 7689 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/83377bd5-67a6-4108-b3ac-a3d338813fc1-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.859630 master-0 kubenswrapper[7689]: I0307 21:15:37.859478 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/83377bd5-67a6-4108-b3ac-a3d338813fc1-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:15:37.986972 master-0 kubenswrapper[7689]: I0307 21:15:37.986924 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"]
Mar 07 21:15:37.992515 master-0 kubenswrapper[7689]: I0307 21:15:37.990623 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7df7f5b8c-5rhtx"]
Mar 07 21:15:38.370078 master-0 kubenswrapper[7689]: I0307 21:15:38.369988 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"]
Mar 07 21:15:38.370300 master-0 kubenswrapper[7689]: E0307 21:15:38.370240 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9837e9f-f72d-44b7-8f75-abe00884bff6" containerName="route-controller-manager"
Mar 07 21:15:38.370300 master-0 kubenswrapper[7689]: I0307 21:15:38.370259 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9837e9f-f72d-44b7-8f75-abe00884bff6" containerName="route-controller-manager"
Mar 07 21:15:38.370300 master-0 kubenswrapper[7689]: E0307 21:15:38.370275 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83377bd5-67a6-4108-b3ac-a3d338813fc1" containerName="controller-manager"
Mar 07 21:15:38.370300 master-0 kubenswrapper[7689]: I0307 21:15:38.370285 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="83377bd5-67a6-4108-b3ac-a3d338813fc1" containerName="controller-manager"
Mar 07 21:15:38.370456 master-0 kubenswrapper[7689]: I0307 21:15:38.370370 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="83377bd5-67a6-4108-b3ac-a3d338813fc1" containerName="controller-manager"
Mar 07 21:15:38.370456 master-0 kubenswrapper[7689]: I0307 21:15:38.370386 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9837e9f-f72d-44b7-8f75-abe00884bff6" containerName="route-controller-manager"
Mar 07 21:15:38.370662 master-0 kubenswrapper[7689]: I0307 21:15:38.370628 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"]
Mar 07 21:15:38.371062 master-0 kubenswrapper[7689]: I0307 21:15:38.371024 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.371594 master-0 kubenswrapper[7689]: I0307 21:15:38.371538 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.382063 master-0 kubenswrapper[7689]: I0307 21:15:38.376328 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 07 21:15:38.382063 master-0 kubenswrapper[7689]: I0307 21:15:38.376767 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 07 21:15:38.382063 master-0 kubenswrapper[7689]: I0307 21:15:38.377046 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l888p"
Mar 07 21:15:38.382063 master-0 kubenswrapper[7689]: I0307 21:15:38.377509 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 07 21:15:38.382063 master-0 kubenswrapper[7689]: I0307 21:15:38.377612 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h286h"
Mar 07 21:15:38.382063 master-0 kubenswrapper[7689]: I0307 21:15:38.377718 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 07 21:15:38.382063 master-0 kubenswrapper[7689]: I0307 21:15:38.379491 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 07 21:15:38.389195 master-0 kubenswrapper[7689]: I0307 21:15:38.389157 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"]
Mar 07 21:15:38.392781 master-0 kubenswrapper[7689]: I0307 21:15:38.392749 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"]
Mar 07 21:15:38.468553 master-0 kubenswrapper[7689]: I0307 21:15:38.468387 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.468798 master-0 kubenswrapper[7689]: I0307 21:15:38.468580 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.468798 master-0 kubenswrapper[7689]: I0307 21:15:38.468620 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.468798 master-0 kubenswrapper[7689]: I0307 21:15:38.468648 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.468798 master-0 kubenswrapper[7689]: I0307 21:15:38.468696 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.468798 master-0 kubenswrapper[7689]: I0307 21:15:38.468759 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.468798 master-0 kubenswrapper[7689]: I0307 21:15:38.468781 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.468973 master-0 kubenswrapper[7689]: I0307 21:15:38.468806 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vvzbm\" (UniqueName: \"kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.469114 master-0 kubenswrapper[7689]: I0307 21:15:38.468831 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzr66\" (UniqueName: \"kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.570967 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571218 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571244 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571268 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571295 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571313 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571448 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571479 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzbm\" (UniqueName: \"kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.571224 master-0 kubenswrapper[7689]: I0307 21:15:38.571966 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzr66\" (UniqueName: \"kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.572868 master-0 kubenswrapper[7689]: I0307 21:15:38.572607 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.572997 master-0 kubenswrapper[7689]: I0307 21:15:38.572934 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.573085 master-0 kubenswrapper[7689]: I0307 21:15:38.573062 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.574143 master-0 kubenswrapper[7689]: I0307 21:15:38.574062 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.575028 master-0 kubenswrapper[7689]: I0307 21:15:38.574982 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.575343 master-0 kubenswrapper[7689]: I0307 21:15:38.575303 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.578529 master-0 kubenswrapper[7689]: I0307 21:15:38.578486 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.602386 master-0 kubenswrapper[7689]: I0307 21:15:38.602300 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzbm\" (UniqueName: \"kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.610522 master-0 kubenswrapper[7689]: I0307 21:15:38.610428 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzr66\" (UniqueName: \"kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.664730 master-0 kubenswrapper[7689]: I0307 21:15:38.664594 7689 generic.go:334] "Generic (PLEG): container finished" podID="5b339e6a-cae6-416a-963b-2fd23cecba96" containerID="4ce1bc8e249944d7cde9f138282cf087a8521cf190e44f0f1b32f20172ea8a91" exitCode=0
Mar 07 21:15:38.664950 master-0 kubenswrapper[7689]: I0307 21:15:38.664737 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" event={"ID":"5b339e6a-cae6-416a-963b-2fd23cecba96","Type":"ContainerDied","Data":"4ce1bc8e249944d7cde9f138282cf087a8521cf190e44f0f1b32f20172ea8a91"}
Mar 07 21:15:38.666146 master-0 kubenswrapper[7689]: I0307 21:15:38.666112 7689 scope.go:117] "RemoveContainer" containerID="4ce1bc8e249944d7cde9f138282cf087a8521cf190e44f0f1b32f20172ea8a91"
Mar 07 21:15:38.666962 master-0 kubenswrapper[7689]: I0307 21:15:38.666930 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"
Mar 07 21:15:38.667077 master-0 kubenswrapper[7689]: I0307 21:15:38.666943 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-64655dcbb9-bp4zn" event={"ID":"83377bd5-67a6-4108-b3ac-a3d338813fc1","Type":"ContainerDied","Data":"c8d2c0237170ab58ef6b21b5bf0bf18e7f397acaddfef52805db630c8eaf1d53"}
Mar 07 21:15:38.667144 master-0 kubenswrapper[7689]: I0307 21:15:38.667115 7689 scope.go:117] "RemoveContainer" containerID="cf850f29efe8ab776668f6d57f50250ae977e92fba1ea0538e7ff3a1dfc8b3ac"
Mar 07 21:15:38.671700 master-0 kubenswrapper[7689]: I0307 21:15:38.671650 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" event={"ID":"982319eb-2dc2-4faa-85d8-ee11840179fd","Type":"ContainerStarted","Data":"d0c8f910f29b908238dbc63bf9ac7b0f87a9546eaf7538fe52110d4fc58afa92"}
Mar 07 21:15:38.671794 master-0 kubenswrapper[7689]: I0307 21:15:38.671734 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" event={"ID":"982319eb-2dc2-4faa-85d8-ee11840179fd","Type":"ContainerStarted","Data":"3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6"}
Mar 07 21:15:38.676191 master-0 kubenswrapper[7689]: I0307 21:15:38.676147 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l2bdp" event={"ID":"dd310b71-6c79-4169-8b8a-7b3fe35a97fd","Type":"ContainerStarted","Data":"ff2da7c12a8ba38eb3f5409c438b34180161e6aa8c5abc889b6e2c851219fbb9"}
Mar 07 21:15:38.676256 master-0 kubenswrapper[7689]: I0307 21:15:38.676203 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-l2bdp" event={"ID":"dd310b71-6c79-4169-8b8a-7b3fe35a97fd","Type":"ContainerStarted","Data":"e483a8b4869353ae6becbd99b74246870373caefb073977c11c65dede44dc6ea"}
Mar 07 21:15:38.681545 master-0 kubenswrapper[7689]: I0307 21:15:38.681474 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:15:38.698983 master-0 kubenswrapper[7689]: I0307 21:15:38.698912 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:38.704758 master-0 kubenswrapper[7689]: I0307 21:15:38.704650 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9837e9f-f72d-44b7-8f75-abe00884bff6" path="/var/lib/kubelet/pods/a9837e9f-f72d-44b7-8f75-abe00884bff6/volumes"
Mar 07 21:15:38.720393 master-0 kubenswrapper[7689]: I0307 21:15:38.720323 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:38.858355 master-0 kubenswrapper[7689]: I0307 21:15:38.858293 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"]
Mar 07 21:15:38.868502 master-0 kubenswrapper[7689]: I0307 21:15:38.867529 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-64655dcbb9-bp4zn"]
Mar 07 21:15:39.172116 master-0 kubenswrapper[7689]: I0307 21:15:39.172025 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"]
Mar 07 21:15:39.243500 master-0 kubenswrapper[7689]: I0307 21:15:39.243285 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"]
Mar 07 21:15:39.702322 master-0 kubenswrapper[7689]: I0307 21:15:39.702237 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" event={"ID":"6deed9a9-6702-4177-a35d-58ad9930a893","Type":"ContainerStarted","Data":"ad261fabb7ddabed91944dcee1de6f4489253aa6b0b8c94f1078f8b07e107a86"}
Mar 07 21:15:39.702322 master-0 kubenswrapper[7689]: I0307 21:15:39.702301 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" event={"ID":"6deed9a9-6702-4177-a35d-58ad9930a893","Type":"ContainerStarted","Data":"9a3242defcab78a5704c3ac516165c6355f42a0842d58543e6938dbfa54c0dc4"}
Mar 07 21:15:39.703509 master-0 kubenswrapper[7689]: I0307 21:15:39.703477 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:39.705005 master-0 kubenswrapper[7689]: I0307 21:15:39.704974 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" event={"ID":"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907","Type":"ContainerStarted","Data":"e28998f60449f58259c0cdb625118f7b6c9387b10abccbf9ef475bb39dbd3f74"}
Mar 07 21:15:39.705005 master-0 kubenswrapper[7689]: I0307 21:15:39.705000 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" event={"ID":"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907","Type":"ContainerStarted","Data":"59fb206093956750cd2b0971ba9daf6182e197e8af3331245cd46cb229bb1de1"}
Mar 07 21:15:39.705495 master-0 kubenswrapper[7689]: I0307 21:15:39.705450 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:15:39.710547 master-0 kubenswrapper[7689]: I0307 21:15:39.710178 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" event={"ID":"5b339e6a-cae6-416a-963b-2fd23cecba96","Type":"ContainerStarted","Data":"462d721f7750425af90d3f273635e726bcc5aa1beb2ca22700d6eca8c4a03024"}
Mar 07 21:15:39.722601 master-0 kubenswrapper[7689]: I0307 21:15:39.722555 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:15:39.727274 master-0 kubenswrapper[7689]: I0307 21:15:39.726497 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" podStartSLOduration=3.726485384 podStartE2EDuration="3.726485384s" podCreationTimestamp="2026-03-07 21:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:39.72557579 +0000 UTC m=+73.277902682" watchObservedRunningTime="2026-03-07 21:15:39.726485384 +0000 UTC
m=+73.278812276" Mar 07 21:15:39.773131 master-0 kubenswrapper[7689]: I0307 21:15:39.770778 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" podStartSLOduration=3.770748737 podStartE2EDuration="3.770748737s" podCreationTimestamp="2026-03-07 21:15:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:15:39.769021512 +0000 UTC m=+73.321348404" watchObservedRunningTime="2026-03-07 21:15:39.770748737 +0000 UTC m=+73.323075629" Mar 07 21:15:40.261237 master-0 kubenswrapper[7689]: I0307 21:15:40.260142 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:15:40.356036 master-0 kubenswrapper[7689]: I0307 21:15:40.355973 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz"] Mar 07 21:15:40.356836 master-0 kubenswrapper[7689]: I0307 21:15:40.356812 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.370978 master-0 kubenswrapper[7689]: I0307 21:15:40.363357 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 21:15:40.370978 master-0 kubenswrapper[7689]: I0307 21:15:40.363641 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-v8df8" Mar 07 21:15:40.370978 master-0 kubenswrapper[7689]: I0307 21:15:40.364424 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz"] Mar 07 21:15:40.399028 master-0 kubenswrapper[7689]: I0307 21:15:40.398965 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.399028 master-0 kubenswrapper[7689]: I0307 21:15:40.399023 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp45l\" (UniqueName: \"kubernetes.io/projected/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-kube-api-access-rp45l\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.500756 master-0 kubenswrapper[7689]: I0307 21:15:40.500647 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.501128 master-0 kubenswrapper[7689]: I0307 21:15:40.500744 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp45l\" (UniqueName: \"kubernetes.io/projected/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-kube-api-access-rp45l\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.506569 master-0 kubenswrapper[7689]: I0307 21:15:40.506518 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.519507 master-0 kubenswrapper[7689]: I0307 21:15:40.519402 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp45l\" (UniqueName: \"kubernetes.io/projected/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-kube-api-access-rp45l\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.685789 master-0 kubenswrapper[7689]: I0307 21:15:40.685725 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:15:40.692012 master-0 kubenswrapper[7689]: I0307 21:15:40.691964 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83377bd5-67a6-4108-b3ac-a3d338813fc1" path="/var/lib/kubelet/pods/83377bd5-67a6-4108-b3ac-a3d338813fc1/volumes" Mar 07 21:15:42.723537 master-0 kubenswrapper[7689]: I0307 21:15:42.723445 7689 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0-master-0"] Mar 07 21:15:42.724132 master-0 kubenswrapper[7689]: I0307 21:15:42.723907 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" containerID="cri-o://a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08" gracePeriod=30 Mar 07 21:15:42.724132 master-0 kubenswrapper[7689]: I0307 21:15:42.724016 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0-master-0" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" containerID="cri-o://8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe" gracePeriod=30 Mar 07 21:15:42.725646 master-0 kubenswrapper[7689]: I0307 21:15:42.725600 7689 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 07 21:15:42.725918 master-0 kubenswrapper[7689]: E0307 21:15:42.725869 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 07 21:15:42.725918 master-0 kubenswrapper[7689]: I0307 21:15:42.725893 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 07 21:15:42.725918 master-0 kubenswrapper[7689]: E0307 21:15:42.725918 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="354f29997baa583b6238f7de9108ee10" 
containerName="etcdctl" Mar 07 21:15:42.726042 master-0 kubenswrapper[7689]: I0307 21:15:42.725924 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 07 21:15:42.726042 master-0 kubenswrapper[7689]: I0307 21:15:42.726012 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcd" Mar 07 21:15:42.726042 master-0 kubenswrapper[7689]: I0307 21:15:42.726024 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="354f29997baa583b6238f7de9108ee10" containerName="etcdctl" Mar 07 21:15:42.735861 master-0 kubenswrapper[7689]: I0307 21:15:42.735317 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.845024 master-0 kubenswrapper[7689]: I0307 21:15:42.844931 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.845275 master-0 kubenswrapper[7689]: I0307 21:15:42.845044 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.845275 master-0 kubenswrapper[7689]: I0307 21:15:42.845204 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.845275 master-0 
kubenswrapper[7689]: I0307 21:15:42.845233 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.845482 master-0 kubenswrapper[7689]: I0307 21:15:42.845375 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.845542 master-0 kubenswrapper[7689]: I0307 21:15:42.845516 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.946953 master-0 kubenswrapper[7689]: I0307 21:15:42.946813 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947196 master-0 kubenswrapper[7689]: I0307 21:15:42.946967 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947196 master-0 kubenswrapper[7689]: I0307 21:15:42.946984 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947196 master-0 kubenswrapper[7689]: I0307 21:15:42.946904 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947275 master-0 kubenswrapper[7689]: I0307 21:15:42.947177 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947275 master-0 kubenswrapper[7689]: I0307 21:15:42.947208 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947275 master-0 kubenswrapper[7689]: I0307 21:15:42.947241 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947409 master-0 kubenswrapper[7689]: I0307 21:15:42.947347 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947541 master-0 kubenswrapper[7689]: I0307 
21:15:42.947513 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947610 master-0 kubenswrapper[7689]: I0307 21:15:42.947589 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947610 master-0 kubenswrapper[7689]: I0307 21:15:42.947601 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:42.947673 master-0 kubenswrapper[7689]: I0307 21:15:42.947630 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:15:43.772432 master-0 kubenswrapper[7689]: I0307 21:15:43.772353 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" event={"ID":"69851821-e1fc-44a8-98df-0cfe9d564126","Type":"ContainerStarted","Data":"7aaed8a833b3068593d26b6804ec3a006285f7a402c4ef65546ea1c84ea6ae4d"} Mar 07 21:15:43.773172 master-0 kubenswrapper[7689]: I0307 21:15:43.772710 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:15:43.775235 master-0 
kubenswrapper[7689]: I0307 21:15:43.775176 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" event={"ID":"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149","Type":"ContainerStarted","Data":"6690322ef152ddb1743025780f4e212cb381fc5357beb0407cc2777292df2c5a"} Mar 07 21:15:43.775658 master-0 kubenswrapper[7689]: I0307 21:15:43.775598 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:15:43.777758 master-0 kubenswrapper[7689]: I0307 21:15:43.777702 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" event={"ID":"e720291b-0f96-4ebb-80f2-5df7cb194ffc","Type":"ContainerStarted","Data":"768e8856043fbb67b776885a9d2a7eceeb5d345ca9e38c33950ec9b98b1495c0"} Mar 07 21:15:43.777973 master-0 kubenswrapper[7689]: I0307 21:15:43.777937 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:15:43.781284 master-0 kubenswrapper[7689]: I0307 21:15:43.781257 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:15:43.782860 master-0 kubenswrapper[7689]: I0307 21:15:43.782822 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:15:46.801134 master-0 kubenswrapper[7689]: I0307 21:15:46.800982 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-98wdp_bd633b72-3d0b-4601-a2c2-3f487d943b35/openshift-controller-manager-operator/0.log" Mar 07 21:15:46.801134 master-0 kubenswrapper[7689]: I0307 21:15:46.801076 7689 generic.go:334] 
"Generic (PLEG): container finished" podID="bd633b72-3d0b-4601-a2c2-3f487d943b35" containerID="8db5d27113ab5fae894c6cc0107da033c6196250dc7c341eeb4aaf2ff2d3a924" exitCode=1 Mar 07 21:15:46.801134 master-0 kubenswrapper[7689]: I0307 21:15:46.801127 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" event={"ID":"bd633b72-3d0b-4601-a2c2-3f487d943b35","Type":"ContainerDied","Data":"8db5d27113ab5fae894c6cc0107da033c6196250dc7c341eeb4aaf2ff2d3a924"} Mar 07 21:15:46.801920 master-0 kubenswrapper[7689]: I0307 21:15:46.801872 7689 scope.go:117] "RemoveContainer" containerID="8db5d27113ab5fae894c6cc0107da033c6196250dc7c341eeb4aaf2ff2d3a924" Mar 07 21:15:47.811764 master-0 kubenswrapper[7689]: I0307 21:15:47.811358 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-98wdp_bd633b72-3d0b-4601-a2c2-3f487d943b35/openshift-controller-manager-operator/0.log" Mar 07 21:15:47.811764 master-0 kubenswrapper[7689]: I0307 21:15:47.811447 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" event={"ID":"bd633b72-3d0b-4601-a2c2-3f487d943b35","Type":"ContainerStarted","Data":"ccd9f245273657c0bc2487d6a294da692f89e6ed1128b54cfcb593029b69f33b"} Mar 07 21:15:49.729847 master-0 kubenswrapper[7689]: I0307 21:15:49.727219 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_576e332a-c381-4582-bb5e-02d32bb376a4/installer/0.log" Mar 07 21:15:49.729847 master-0 kubenswrapper[7689]: I0307 21:15:49.727292 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:49.827368 master-0 kubenswrapper[7689]: I0307 21:15:49.827293 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-1-master-0_576e332a-c381-4582-bb5e-02d32bb376a4/installer/0.log" Mar 07 21:15:49.827821 master-0 kubenswrapper[7689]: I0307 21:15:49.827385 7689 generic.go:334] "Generic (PLEG): container finished" podID="576e332a-c381-4582-bb5e-02d32bb376a4" containerID="03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c" exitCode=1 Mar 07 21:15:49.827821 master-0 kubenswrapper[7689]: I0307 21:15:49.827435 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"576e332a-c381-4582-bb5e-02d32bb376a4","Type":"ContainerDied","Data":"03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c"} Mar 07 21:15:49.827821 master-0 kubenswrapper[7689]: I0307 21:15:49.827496 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-1-master-0" event={"ID":"576e332a-c381-4582-bb5e-02d32bb376a4","Type":"ContainerDied","Data":"27583492499e035b40b8f072f078cc77a6db2ea1938b124426f193c21478705d"} Mar 07 21:15:49.827821 master-0 kubenswrapper[7689]: I0307 21:15:49.827537 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-1-master-0" Mar 07 21:15:49.827821 master-0 kubenswrapper[7689]: I0307 21:15:49.827539 7689 scope.go:117] "RemoveContainer" containerID="03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c" Mar 07 21:15:49.848195 master-0 kubenswrapper[7689]: I0307 21:15:49.848109 7689 scope.go:117] "RemoveContainer" containerID="03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c" Mar 07 21:15:49.849070 master-0 kubenswrapper[7689]: E0307 21:15:49.849012 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c\": container with ID starting with 03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c not found: ID does not exist" containerID="03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c" Mar 07 21:15:49.849264 master-0 kubenswrapper[7689]: I0307 21:15:49.849066 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c"} err="failed to get container status \"03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c\": rpc error: code = NotFound desc = could not find container \"03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c\": container with ID starting with 03f0077984dc99b6a3ff39fcf87570de92b41dae7ed3fe73438832e248ed7a6c not found: ID does not exist" Mar 07 21:15:49.864967 master-0 kubenswrapper[7689]: I0307 21:15:49.864881 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/576e332a-c381-4582-bb5e-02d32bb376a4-kube-api-access\") pod \"576e332a-c381-4582-bb5e-02d32bb376a4\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " Mar 07 21:15:49.865162 master-0 kubenswrapper[7689]: I0307 21:15:49.865015 7689 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-var-lock\") pod \"576e332a-c381-4582-bb5e-02d32bb376a4\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " Mar 07 21:15:49.865162 master-0 kubenswrapper[7689]: I0307 21:15:49.865091 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-kubelet-dir\") pod \"576e332a-c381-4582-bb5e-02d32bb376a4\" (UID: \"576e332a-c381-4582-bb5e-02d32bb376a4\") " Mar 07 21:15:49.865255 master-0 kubenswrapper[7689]: I0307 21:15:49.865191 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-var-lock" (OuterVolumeSpecName: "var-lock") pod "576e332a-c381-4582-bb5e-02d32bb376a4" (UID: "576e332a-c381-4582-bb5e-02d32bb376a4"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:49.865372 master-0 kubenswrapper[7689]: I0307 21:15:49.865328 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "576e332a-c381-4582-bb5e-02d32bb376a4" (UID: "576e332a-c381-4582-bb5e-02d32bb376a4"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:49.865459 master-0 kubenswrapper[7689]: I0307 21:15:49.865427 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:49.865498 master-0 kubenswrapper[7689]: I0307 21:15:49.865456 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/576e332a-c381-4582-bb5e-02d32bb376a4-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:49.869423 master-0 kubenswrapper[7689]: I0307 21:15:49.869357 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576e332a-c381-4582-bb5e-02d32bb376a4-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "576e332a-c381-4582-bb5e-02d32bb376a4" (UID: "576e332a-c381-4582-bb5e-02d32bb376a4"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:15:49.968361 master-0 kubenswrapper[7689]: I0307 21:15:49.968099 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/576e332a-c381-4582-bb5e-02d32bb376a4-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:55.787318 master-0 kubenswrapper[7689]: E0307 21:15:55.787212 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 07 21:15:55.788227 master-0 kubenswrapper[7689]: I0307 21:15:55.788126 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 07 21:15:55.815880 master-0 kubenswrapper[7689]: W0307 21:15:55.815800 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e52bef89f4b50e4590a1719bcc5d7e5.slice/crio-c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6 WatchSource:0}: Error finding container c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6: Status 404 returned error can't find the container with id c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6 Mar 07 21:15:55.871891 master-0 kubenswrapper[7689]: I0307 21:15:55.871617 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6"} Mar 07 21:15:56.824239 master-0 kubenswrapper[7689]: I0307 21:15:56.824122 7689 patch_prober.go:28] interesting pod/etcd-operator-5884b9cd56-lc94h container/etcd-operator namespace/openshift-etcd-operator: Liveness probe status=failure output="Get \"https://10.128.0.24:8443/healthz\": dial tcp 10.128.0.24:8443: connect: connection refused" start-of-body= Mar 07 21:15:56.824239 master-0 kubenswrapper[7689]: I0307 21:15:56.824221 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" podUID="5f82d4aa-0cb5-477f-944e-745a21d124fc" containerName="etcd-operator" probeResult="failure" output="Get \"https://10.128.0.24:8443/healthz\": dial tcp 10.128.0.24:8443: connect: connection refused" Mar 07 21:15:56.884360 master-0 kubenswrapper[7689]: I0307 21:15:56.884283 7689 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="1720f06011ab4886e92b7c5a8e88d7c953f6ae789c60589ab28e6980a7428f51" exitCode=1 Mar 07 21:15:56.884535 master-0 kubenswrapper[7689]: I0307 
21:15:56.884394 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"1720f06011ab4886e92b7c5a8e88d7c953f6ae789c60589ab28e6980a7428f51"} Mar 07 21:15:56.884535 master-0 kubenswrapper[7689]: I0307 21:15:56.884445 7689 scope.go:117] "RemoveContainer" containerID="14697f7165ea16496d207a527c3a0eec6d705bfe290e2065971615387572920a" Mar 07 21:15:56.885638 master-0 kubenswrapper[7689]: I0307 21:15:56.885576 7689 scope.go:117] "RemoveContainer" containerID="1720f06011ab4886e92b7c5a8e88d7c953f6ae789c60589ab28e6980a7428f51" Mar 07 21:15:56.888346 master-0 kubenswrapper[7689]: I0307 21:15:56.888251 7689 generic.go:334] "Generic (PLEG): container finished" podID="e757a93e-91aa-4fce-949b-4c51a060528e" containerID="a049a3a4077135fa9e02b1a9804eac864bd6874b0847dc250b8650ce1e94ce1d" exitCode=0 Mar 07 21:15:56.888516 master-0 kubenswrapper[7689]: I0307 21:15:56.888336 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"e757a93e-91aa-4fce-949b-4c51a060528e","Type":"ContainerDied","Data":"a049a3a4077135fa9e02b1a9804eac864bd6874b0847dc250b8650ce1e94ce1d"} Mar 07 21:15:56.891015 master-0 kubenswrapper[7689]: I0307 21:15:56.890929 7689 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317" exitCode=0 Mar 07 21:15:56.891015 master-0 kubenswrapper[7689]: I0307 21:15:56.890960 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317"} Mar 07 21:15:57.013606 master-0 kubenswrapper[7689]: I0307 21:15:57.013519 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:15:57.053112 master-0 kubenswrapper[7689]: I0307 21:15:57.053023 7689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:15:57.903456 master-0 kubenswrapper[7689]: I0307 21:15:57.903311 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e"} Mar 07 21:15:58.200161 master-0 kubenswrapper[7689]: I0307 21:15:58.200101 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:58.302742 master-0 kubenswrapper[7689]: I0307 21:15:58.302586 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e757a93e-91aa-4fce-949b-4c51a060528e-kube-api-access\") pod \"e757a93e-91aa-4fce-949b-4c51a060528e\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " Mar 07 21:15:58.302742 master-0 kubenswrapper[7689]: I0307 21:15:58.302748 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-var-lock\") pod \"e757a93e-91aa-4fce-949b-4c51a060528e\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " Mar 07 21:15:58.303165 master-0 kubenswrapper[7689]: I0307 21:15:58.302831 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-kubelet-dir\") pod \"e757a93e-91aa-4fce-949b-4c51a060528e\" (UID: \"e757a93e-91aa-4fce-949b-4c51a060528e\") " Mar 07 21:15:58.303165 master-0 kubenswrapper[7689]: I0307 21:15:58.303025 7689 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-var-lock" (OuterVolumeSpecName: "var-lock") pod "e757a93e-91aa-4fce-949b-4c51a060528e" (UID: "e757a93e-91aa-4fce-949b-4c51a060528e"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:58.303165 master-0 kubenswrapper[7689]: I0307 21:15:58.303112 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e757a93e-91aa-4fce-949b-4c51a060528e" (UID: "e757a93e-91aa-4fce-949b-4c51a060528e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:15:58.303645 master-0 kubenswrapper[7689]: I0307 21:15:58.303587 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:58.303776 master-0 kubenswrapper[7689]: I0307 21:15:58.303642 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e757a93e-91aa-4fce-949b-4c51a060528e-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:58.308279 master-0 kubenswrapper[7689]: I0307 21:15:58.308202 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e757a93e-91aa-4fce-949b-4c51a060528e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e757a93e-91aa-4fce-949b-4c51a060528e" (UID: "e757a93e-91aa-4fce-949b-4c51a060528e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:15:58.405187 master-0 kubenswrapper[7689]: I0307 21:15:58.405095 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e757a93e-91aa-4fce-949b-4c51a060528e-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:15:58.552237 master-0 kubenswrapper[7689]: E0307 21:15:58.552007 7689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:15:58.912271 master-0 kubenswrapper[7689]: I0307 21:15:58.912188 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 07 21:15:58.912271 master-0 kubenswrapper[7689]: I0307 21:15:58.912186 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-1-master-0" event={"ID":"e757a93e-91aa-4fce-949b-4c51a060528e","Type":"ContainerDied","Data":"34c367e3b7cd662a238cd3cf60724c5f41e1100b6bc750255dda8f40be5bf92e"} Mar 07 21:15:58.912271 master-0 kubenswrapper[7689]: I0307 21:15:58.912276 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c367e3b7cd662a238cd3cf60724c5f41e1100b6bc750255dda8f40be5bf92e" Mar 07 21:15:59.926262 master-0 kubenswrapper[7689]: I0307 21:15:59.926156 7689 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da" exitCode=1 Mar 07 21:15:59.926262 master-0 kubenswrapper[7689]: I0307 21:15:59.926251 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" 
event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da"} Mar 07 21:15:59.927215 master-0 kubenswrapper[7689]: I0307 21:15:59.926969 7689 scope.go:117] "RemoveContainer" containerID="fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da" Mar 07 21:16:00.935830 master-0 kubenswrapper[7689]: I0307 21:16:00.935655 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7"} Mar 07 21:16:04.336516 master-0 kubenswrapper[7689]: I0307 21:16:04.336398 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:16:05.967997 master-0 kubenswrapper[7689]: I0307 21:16:05.967792 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_34e734b7-82d6-493d-ace8-1945b2c08c6d/installer/0.log" Mar 07 21:16:05.967997 master-0 kubenswrapper[7689]: I0307 21:16:05.967880 7689 generic.go:334] "Generic (PLEG): container finished" podID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerID="b18addaef135e00fefdd51e68b734344679afa8f4606f39797d35e107db0fa22" exitCode=1 Mar 07 21:16:05.967997 master-0 kubenswrapper[7689]: I0307 21:16:05.967925 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"34e734b7-82d6-493d-ace8-1945b2c08c6d","Type":"ContainerDied","Data":"b18addaef135e00fefdd51e68b734344679afa8f4606f39797d35e107db0fa22"} Mar 07 21:16:07.013539 master-0 kubenswrapper[7689]: I0307 21:16:07.013435 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:16:07.301789 master-0 kubenswrapper[7689]: I0307 21:16:07.301720 
7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_34e734b7-82d6-493d-ace8-1945b2c08c6d/installer/0.log" Mar 07 21:16:07.302005 master-0 kubenswrapper[7689]: I0307 21:16:07.301845 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:16:07.337635 master-0 kubenswrapper[7689]: I0307 21:16:07.337521 7689 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 21:16:07.440077 master-0 kubenswrapper[7689]: I0307 21:16:07.439943 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e734b7-82d6-493d-ace8-1945b2c08c6d-kube-api-access\") pod \"34e734b7-82d6-493d-ace8-1945b2c08c6d\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " Mar 07 21:16:07.440386 master-0 kubenswrapper[7689]: I0307 21:16:07.440131 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-var-lock\") pod \"34e734b7-82d6-493d-ace8-1945b2c08c6d\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " Mar 07 21:16:07.440386 master-0 kubenswrapper[7689]: I0307 21:16:07.440187 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-kubelet-dir\") pod \"34e734b7-82d6-493d-ace8-1945b2c08c6d\" (UID: \"34e734b7-82d6-493d-ace8-1945b2c08c6d\") " Mar 07 21:16:07.440535 master-0 kubenswrapper[7689]: I0307 21:16:07.440376 7689 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-var-lock" (OuterVolumeSpecName: "var-lock") pod "34e734b7-82d6-493d-ace8-1945b2c08c6d" (UID: "34e734b7-82d6-493d-ace8-1945b2c08c6d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:16:07.440535 master-0 kubenswrapper[7689]: I0307 21:16:07.440503 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "34e734b7-82d6-493d-ace8-1945b2c08c6d" (UID: "34e734b7-82d6-493d-ace8-1945b2c08c6d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:16:07.441092 master-0 kubenswrapper[7689]: I0307 21:16:07.441033 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:07.441092 master-0 kubenswrapper[7689]: I0307 21:16:07.441074 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/34e734b7-82d6-493d-ace8-1945b2c08c6d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:07.445290 master-0 kubenswrapper[7689]: I0307 21:16:07.445174 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34e734b7-82d6-493d-ace8-1945b2c08c6d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "34e734b7-82d6-493d-ace8-1945b2c08c6d" (UID: "34e734b7-82d6-493d-ace8-1945b2c08c6d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:16:07.543044 master-0 kubenswrapper[7689]: I0307 21:16:07.542820 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/34e734b7-82d6-493d-ace8-1945b2c08c6d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:07.984953 master-0 kubenswrapper[7689]: I0307 21:16:07.984842 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_34e734b7-82d6-493d-ace8-1945b2c08c6d/installer/0.log" Mar 07 21:16:07.984953 master-0 kubenswrapper[7689]: I0307 21:16:07.984953 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-master-0" event={"ID":"34e734b7-82d6-493d-ace8-1945b2c08c6d","Type":"ContainerDied","Data":"6de23860b0b81dd71d1a71f02b3b23b5ac8368494a9752dfe36eb798dc3827b1"} Mar 07 21:16:07.985324 master-0 kubenswrapper[7689]: I0307 21:16:07.984996 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6de23860b0b81dd71d1a71f02b3b23b5ac8368494a9752dfe36eb798dc3827b1" Mar 07 21:16:07.985324 master-0 kubenswrapper[7689]: I0307 21:16:07.985057 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:16:08.553389 master-0 kubenswrapper[7689]: E0307 21:16:08.552757 7689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:16:08.605204 master-0 kubenswrapper[7689]: I0307 21:16:08.605110 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:08.605459 master-0 kubenswrapper[7689]: I0307 21:16:08.605217 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:08.605459 master-0 kubenswrapper[7689]: I0307 21:16:08.605121 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:08.605459 master-0 kubenswrapper[7689]: I0307 21:16:08.605326 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: 
connect: connection refused" Mar 07 21:16:09.900312 master-0 kubenswrapper[7689]: E0307 21:16:09.900270 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 07 21:16:10.000348 master-0 kubenswrapper[7689]: I0307 21:16:10.000287 7689 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe" exitCode=0 Mar 07 21:16:11.010384 master-0 kubenswrapper[7689]: I0307 21:16:11.010298 7689 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2" exitCode=0 Mar 07 21:16:11.010384 master-0 kubenswrapper[7689]: I0307 21:16:11.010368 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2"} Mar 07 21:16:11.605273 master-0 kubenswrapper[7689]: I0307 21:16:11.605167 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:11.605273 master-0 kubenswrapper[7689]: I0307 21:16:11.605264 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:11.605655 master-0 kubenswrapper[7689]: I0307 
21:16:11.605204 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:11.605655 master-0 kubenswrapper[7689]: I0307 21:16:11.605410 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:12.883086 master-0 kubenswrapper[7689]: I0307 21:16:12.882871 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 07 21:16:12.883086 master-0 kubenswrapper[7689]: I0307 21:16:12.882990 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 07 21:16:13.021102 master-0 kubenswrapper[7689]: I0307 21:16:13.020922 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 07 21:16:13.021102 master-0 kubenswrapper[7689]: I0307 21:16:13.021002 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") pod \"354f29997baa583b6238f7de9108ee10\" (UID: \"354f29997baa583b6238f7de9108ee10\") " Mar 07 21:16:13.021444 master-0 kubenswrapper[7689]: I0307 21:16:13.021138 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir" (OuterVolumeSpecName: "data-dir") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:16:13.021444 master-0 kubenswrapper[7689]: I0307 21:16:13.021188 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs" (OuterVolumeSpecName: "certs") pod "354f29997baa583b6238f7de9108ee10" (UID: "354f29997baa583b6238f7de9108ee10"). InnerVolumeSpecName "certs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:16:13.021623 master-0 kubenswrapper[7689]: I0307 21:16:13.021566 7689 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:13.021623 master-0 kubenswrapper[7689]: I0307 21:16:13.021609 7689 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/354f29997baa583b6238f7de9108ee10-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:13.023129 master-0 kubenswrapper[7689]: I0307 21:16:13.023078 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0-master-0_354f29997baa583b6238f7de9108ee10/etcdctl/0.log" Mar 07 21:16:13.023238 master-0 kubenswrapper[7689]: I0307 21:16:13.023138 7689 generic.go:334] "Generic (PLEG): container finished" podID="354f29997baa583b6238f7de9108ee10" containerID="a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08" exitCode=137 Mar 07 21:16:13.023238 master-0 kubenswrapper[7689]: I0307 21:16:13.023198 7689 scope.go:117] "RemoveContainer" containerID="8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe" Mar 07 21:16:13.023362 master-0 kubenswrapper[7689]: I0307 21:16:13.023258 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0-master-0" Mar 07 21:16:13.039912 master-0 kubenswrapper[7689]: I0307 21:16:13.039856 7689 scope.go:117] "RemoveContainer" containerID="a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08" Mar 07 21:16:13.061476 master-0 kubenswrapper[7689]: I0307 21:16:13.061415 7689 scope.go:117] "RemoveContainer" containerID="8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe" Mar 07 21:16:13.061994 master-0 kubenswrapper[7689]: E0307 21:16:13.061936 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe\": container with ID starting with 8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe not found: ID does not exist" containerID="8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe" Mar 07 21:16:13.062147 master-0 kubenswrapper[7689]: I0307 21:16:13.061990 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe"} err="failed to get container status \"8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe\": rpc error: code = NotFound desc = could not find container \"8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe\": container with ID starting with 8aab245064a0c19dcd4f0e8decc317408dd2e60b2896ff7348cfc85e0242b2fe not found: ID does not exist" Mar 07 21:16:13.062147 master-0 kubenswrapper[7689]: I0307 21:16:13.062032 7689 scope.go:117] "RemoveContainer" containerID="a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08" Mar 07 21:16:13.062444 master-0 kubenswrapper[7689]: E0307 21:16:13.062384 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08\": 
container with ID starting with a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08 not found: ID does not exist" containerID="a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08" Mar 07 21:16:13.062444 master-0 kubenswrapper[7689]: I0307 21:16:13.062421 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08"} err="failed to get container status \"a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08\": rpc error: code = NotFound desc = could not find container \"a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08\": container with ID starting with a71f7d6e201c73ded484926e6d5a47e8daebe7baf87a0a9245a62d5f85c4af08 not found: ID does not exist" Mar 07 21:16:13.216717 master-0 kubenswrapper[7689]: I0307 21:16:13.216574 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:16:14.605378 master-0 kubenswrapper[7689]: I0307 21:16:14.605287 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:14.606276 master-0 kubenswrapper[7689]: I0307 21:16:14.605382 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:14.606276 master-0 kubenswrapper[7689]: I0307 21:16:14.605459 7689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:16:14.606276 master-0 kubenswrapper[7689]: I0307 21:16:14.605509 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:14.606276 master-0 kubenswrapper[7689]: I0307 21:16:14.605654 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:14.606276 master-0 kubenswrapper[7689]: I0307 21:16:14.605886 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:16:14.606637 master-0 kubenswrapper[7689]: I0307 21:16:14.606342 7689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"7a9945baea4c13f880fbc215f8a1631a572c12331242f734424a747e14d18656"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted" Mar 07 21:16:14.606637 master-0 kubenswrapper[7689]: I0307 21:16:14.606418 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" containerID="cri-o://7a9945baea4c13f880fbc215f8a1631a572c12331242f734424a747e14d18656" gracePeriod=30 Mar 07 21:16:14.696731 
master-0 kubenswrapper[7689]: I0307 21:16:14.696615 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="354f29997baa583b6238f7de9108ee10" path="/var/lib/kubelet/pods/354f29997baa583b6238f7de9108ee10/volumes" Mar 07 21:16:14.697454 master-0 kubenswrapper[7689]: I0307 21:16:14.697394 7689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 07 21:16:16.757946 master-0 kubenswrapper[7689]: E0307 21:16:16.757660 7689 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0-master-0.189aaba850e866ad openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0-master-0,UID:354f29997baa583b6238f7de9108ee10,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:15:42.723974829 +0000 UTC m=+76.276301731,LastTimestamp:2026-03-07 21:15:42.723974829 +0000 UTC m=+76.276301731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:16:17.337237 master-0 kubenswrapper[7689]: I0307 21:16:17.337063 7689 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 21:16:17.605498 master-0 kubenswrapper[7689]: I0307 21:16:17.605370 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 
container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:17.605498 master-0 kubenswrapper[7689]: I0307 21:16:17.605472 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:17.778485 master-0 kubenswrapper[7689]: I0307 21:16:17.778364 7689 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-7w8wf container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body= Mar 07 21:16:17.778485 master-0 kubenswrapper[7689]: I0307 21:16:17.778482 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" podUID="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" Mar 07 21:16:18.554063 master-0 kubenswrapper[7689]: E0307 21:16:18.553958 7689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:16:19.064065 master-0 kubenswrapper[7689]: I0307 21:16:19.063850 7689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_ddc814a4-b865-4a35-b5f8-f54af449fe25/installer/0.log" Mar 07 21:16:19.064065 master-0 kubenswrapper[7689]: I0307 21:16:19.064054 7689 generic.go:334] "Generic (PLEG): container finished" podID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerID="f84fa34d05ad67aec62ca362c7866be59185619297d89ccd25b8d12c9a739a50" exitCode=1 Mar 07 21:16:19.093806 master-0 kubenswrapper[7689]: E0307 21:16:19.093161 7689 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:16:09Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:16:09Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:16:09Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:16:09Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"siz
eBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a
c6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\
"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2fe5144b1f72bdcf5d5a52130f02ed86fbec3875cc4ac108ead00eaac1659e06\\\"],\\\"sizeBytes\\\":487090672},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4a4c3e6ca0cd26f7eb5270cfafbcf423cf2986d152bf5b9fc6469d40599e104e\\\"],\\\"sizeBytes\\\":484450382},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c54c3f7cffe057ae0bdf26163d5e46744685083ae16fc97112e32beacd2d8955\\\"],\\\"sizeBytes\\\":484175664},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d74fe7cb12c554c120262683d9c4066f33ae4f60a5fad83cba419d851b98c12d\\\"],\\\"sizeBytes\\\":470822665},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9b8bc43bac294be3c7669cde049e388ad9d8751242051ba40f83e1c401eceda\\\"],\\\"sizeBytes\\\":468263999},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8677f7a973553c25d282bc249fc
8bc0f5aa42fb144ea0956d1f04c5a6cd80501\\\"],\\\"sizeBytes\\\":465086330},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a85dab5856916220df6f05ce9d6aa10cd4fa0234093b55355246690bba05ad1\\\"],\\\"sizeBytes\\\":463700811},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b714a7ada1e295b599b432f32e1fd5b74c8cdbe6fe51e95306322b25cb873914\\\"],\\\"sizeBytes\\\":458126424},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5230462066ab36e3025524e948dd33fa6f51ee29a4f91fa469bfc268568b5fd9\\\"],\\\"sizeBytes\\\":456575686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:89cb093f319eaa04acfe9431b8697bffbc71ab670546f7ed257daa332165c626\\\"],\\\"sizeBytes\\\":448828105},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c680fcc9fd6b66099ca4c0f512521b6f8e0bc29273ddb9405730bc54bacb6783\\\"],\\\"sizeBytes\\\":448041621},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cf9670d0f269f8d49fd9ef4981999be195f6624a4146aa93d9201eb8acc81053\\\"],\\\"sizeBytes\\\":443271011},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8ceca1efee55b9fd5089428476bbc401fe73db7c0b0f5e16d4ad28ed0f0f9d43\\\"],\\\"sizeBytes\\\":438654375},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ace4dcd008420277d915fe983b07bbb50fb3ab0673f28d0166424a75bc2137e7\\\"],\\\"sizeBytes\\\":411585608},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8f0fda36e9a2040dbe0537361dcd73658df4e669d846f8101a8f9f29f0be9a7\\\"],\\\"sizeBytes\\\":407347126},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3\\\"],\\\"sizeBytes\\\":396521759}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" 
Mar 07 21:16:20.076962 master-0 kubenswrapper[7689]: I0307 21:16:20.076877 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-v4xm9_f8980370-267c-4168-ba97-d780698533ff/network-operator/0.log"
Mar 07 21:16:20.076962 master-0 kubenswrapper[7689]: I0307 21:16:20.076964 7689 generic.go:334] "Generic (PLEG): container finished" podID="f8980370-267c-4168-ba97-d780698533ff" containerID="a365b415335d369b3b6313971188bcd1400d9e9f3efd23b32ee5ec456091c9db" exitCode=255
Mar 07 21:16:20.605241 master-0 kubenswrapper[7689]: I0307 21:16:20.605189 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:20.605641 master-0 kubenswrapper[7689]: I0307 21:16:20.605554 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:21.085129 master-0 kubenswrapper[7689]: I0307 21:16:21.085061 7689 generic.go:334] "Generic (PLEG): container finished" podID="3faedef9-d507-48aa-82a8-f3dc9b5adeef" containerID="f737e30d954aa064b6cfef3a212e4d7f5057ece37e1afcdb2a92dd75d8adab26" exitCode=0
Mar 07 21:16:21.087081 master-0 kubenswrapper[7689]: I0307 21:16:21.087035 7689 generic.go:334] "Generic (PLEG): container finished" podID="e543d99f-e0dc-49be-95bd-c39eabd05ce8" containerID="bb9512b327c952122a6ba9c90bf697a16d6d7a153e8ba4baf488a717c15e85eb" exitCode=0
Mar 07 21:16:23.604955 master-0 kubenswrapper[7689]: I0307 21:16:23.604855 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:23.606347 master-0 kubenswrapper[7689]: I0307 21:16:23.604964 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:24.021420 master-0 kubenswrapper[7689]: E0307 21:16:24.021233 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 07 21:16:25.115010 master-0 kubenswrapper[7689]: I0307 21:16:25.114935 7689 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31" exitCode=0
Mar 07 21:16:26.604793 master-0 kubenswrapper[7689]: I0307 21:16:26.604642 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:26.604793 master-0 kubenswrapper[7689]: I0307 21:16:26.604785 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:27.337915 master-0 kubenswrapper[7689]: I0307 21:16:27.337541 7689 prober.go:107] "Probe failed" probeType="Startup" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.32.10:10257/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:16:27.778943 master-0 kubenswrapper[7689]: I0307 21:16:27.778816 7689 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-7w8wf container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body=
Mar 07 21:16:27.778943 master-0 kubenswrapper[7689]: I0307 21:16:27.778929 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" podUID="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused"
Mar 07 21:16:28.139107 master-0 kubenswrapper[7689]: I0307 21:16:28.138999 7689 generic.go:334] "Generic (PLEG): container finished" podID="8269652e-360f-43ef-9e7d-473c5f478275" containerID="e4c20cfb39db1342bdb31f41fc9c1caf9efa43065ea9e9334f061db96ddead54" exitCode=0
Mar 07 21:16:28.141587 master-0 kubenswrapper[7689]: I0307 21:16:28.141503 7689 generic.go:334] "Generic (PLEG): container finished" podID="b88c5fbe-e19f-45b3-ab03-e1626f95776d" containerID="4dd4ab96de66a81d1a97cd72bb912ec500681a0000024a0cfaf545c2eaf36106" exitCode=0
Mar 07 21:16:28.555205 master-0 kubenswrapper[7689]: E0307 21:16:28.554970 7689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:16:29.095095 master-0 kubenswrapper[7689]: E0307 21:16:29.094934 7689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:16:29.605666 master-0 kubenswrapper[7689]: I0307 21:16:29.605542 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:29.606006 master-0 kubenswrapper[7689]: I0307 21:16:29.605667 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:30.155843 master-0 kubenswrapper[7689]: I0307 21:16:30.155629 7689 generic.go:334] "Generic (PLEG): container finished" podID="5f82d4aa-0cb5-477f-944e-745a21d124fc" containerID="42f741a1d8745f4ba4855310764e131077825a56cb2981843ca7f7c641b06c4d" exitCode=0
Mar 07 21:16:32.605346 master-0 kubenswrapper[7689]: I0307 21:16:32.605268 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:32.606605 master-0 kubenswrapper[7689]: I0307 21:16:32.606537 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:35.605492 master-0 kubenswrapper[7689]: I0307 21:16:35.605362 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:35.606298 master-0 kubenswrapper[7689]: I0307 21:16:35.605527 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:36.198429 master-0 kubenswrapper[7689]: I0307 21:16:36.198285 7689 generic.go:334] "Generic (PLEG): container finished" podID="abfb5602-7255-43d7-a510-e7f94885887e" containerID="98e7e40d5b40416680e1b256712d9b6487df5695b6f01c16e2334511df19f429" exitCode=0
Mar 07 21:16:37.778666 master-0 kubenswrapper[7689]: I0307 21:16:37.778320 7689 patch_prober.go:28] interesting pod/authentication-operator-7c6989d6c4-7w8wf container/authentication-operator namespace/openshift-authentication-operator: Liveness probe status=failure output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused" start-of-body=
Mar 07 21:16:37.778666 master-0 kubenswrapper[7689]: I0307 21:16:37.778480 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" podUID="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" containerName="authentication-operator" probeResult="failure" output="Get \"https://10.128.0.13:8443/healthz\": dial tcp 10.128.0.13:8443: connect: connection refused"
Mar 07 21:16:38.124580 master-0 kubenswrapper[7689]: E0307 21:16:38.124482 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0"
Mar 07 21:16:38.555470 master-0 kubenswrapper[7689]: E0307 21:16:38.555385 7689 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:16:38.555470 master-0 kubenswrapper[7689]: I0307 21:16:38.555460 7689 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease"
Mar 07 21:16:38.605040 master-0 kubenswrapper[7689]: I0307 21:16:38.604949 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:38.605418 master-0 kubenswrapper[7689]: I0307 21:16:38.605080 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:39.095943 master-0 kubenswrapper[7689]: E0307 21:16:39.095835 7689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:16:41.635541 master-0 kubenswrapper[7689]: I0307 21:16:41.605448 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:41.635541 master-0 kubenswrapper[7689]: I0307 21:16:41.605544 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:43.275907 master-0 kubenswrapper[7689]: I0307 21:16:43.275830 7689 generic.go:334] "Generic (PLEG): container finished" podID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerID="7a9945baea4c13f880fbc215f8a1631a572c12331242f734424a747e14d18656" exitCode=0
Mar 07 21:16:43.775276 master-0 kubenswrapper[7689]: I0307 21:16:43.775190 7689 status_manager.go:851] "Failed to get status for pod" podUID="69851821-e1fc-44a8-98df-0cfe9d564126" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods olm-operator-d64cfc9db-qd6xh)"
Mar 07 21:16:43.841199 master-0 kubenswrapper[7689]: E0307 21:16:43.841115 7689 log.go:32] "RunPodSandbox from runtime service failed" err=<
Mar 07 21:16:43.841199 master-0 kubenswrapper[7689]: rpc error: 
code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dgjgz_openshift-machine-api_1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c_0(e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b): error adding pod openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dgjgz to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b" Netns:"/var/run/netns/8a335742-3560-470e-9707-8b2d713ca525" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dgjgz;K8S_POD_INFRA_CONTAINER_ID=e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b;K8S_POD_UID=1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dgjgz?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 07 21:16:43.841199 master-0 kubenswrapper[7689]: ': StdinData: 
{"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 07 21:16:43.841199 master-0 kubenswrapper[7689]: > Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: E0307 21:16:43.841232 7689 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err=< Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dgjgz_openshift-machine-api_1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c_0(e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b): error adding pod openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dgjgz to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b" Netns:"/var/run/netns/8a335742-3560-470e-9707-8b2d713ca525" IfName:"eth0" Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dgjgz;K8S_POD_INFRA_CONTAINER_ID=e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b;K8S_POD_UID=1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: SetNetworkStatus: failed to update the 
pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dgjgz?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: > pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: E0307 21:16:43.841266 7689 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err=< Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dgjgz_openshift-machine-api_1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c_0(e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b): error adding pod openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dgjgz to CNI network "multus-cni-network": plugin type="multus-shim" name="multus-cni-network" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:"e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b" Netns:"/var/run/netns/8a335742-3560-470e-9707-8b2d713ca525" IfName:"eth0" 
Args:"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dgjgz;K8S_POD_INFRA_CONTAINER_ID=e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b;K8S_POD_UID=1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c" Path:"" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: status update failed for pod /: Get "https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dgjgz?timeout=1m0s": net/http: request canceled (Client.Timeout exceeded while awaiting headers) Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: ': StdinData: {"binDir":"/var/lib/cni/bin","clusterNetwork":"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf","cniVersion":"0.3.1","daemonSocketDir":"/run/multus/socket","globalNamespaces":"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv","logLevel":"verbose","logToStderr":true,"name":"multus-cni-network","namespaceIsolation":true,"type":"multus-shim"} Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: > pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:16:43.841506 master-0 kubenswrapper[7689]: E0307 21:16:43.841375 7689 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"control-plane-machine-set-operator-6686554ddc-dgjgz_openshift-machine-api(1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod 
\\\"control-plane-machine-set-operator-6686554ddc-dgjgz_openshift-machine-api(1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_control-plane-machine-set-operator-6686554ddc-dgjgz_openshift-machine-api_1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c_0(e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b): error adding pod openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dgjgz to CNI network \\\"multus-cni-network\\\": plugin type=\\\"multus-shim\\\" name=\\\"multus-cni-network\\\" failed (add): CmdAdd (shim): CNI request failed with status 400: 'ContainerID:\\\"e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b\\\" Netns:\\\"/var/run/netns/8a335742-3560-470e-9707-8b2d713ca525\\\" IfName:\\\"eth0\\\" Args:\\\"IgnoreUnknown=1;K8S_POD_NAMESPACE=openshift-machine-api;K8S_POD_NAME=control-plane-machine-set-operator-6686554ddc-dgjgz;K8S_POD_INFRA_CONTAINER_ID=e26e5f12dcbcb2d223b658f0890fa17b46ab3d6fe5a85d6a3bb1810c111f416b;K8S_POD_UID=1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\\\" Path:\\\"\\\" ERRORED: error configuring pod [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz] networking: Multus: [openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c]: error setting the networks status: SetPodNetworkStatusAnnotation: failed to update the pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: SetNetworkStatus: failed to update the pod control-plane-machine-set-operator-6686554ddc-dgjgz in out of cluster comm: status update failed for pod /: Get \\\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-api/pods/control-plane-machine-set-operator-6686554ddc-dgjgz?timeout=1m0s\\\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)\\n': StdinData: 
{\\\"binDir\\\":\\\"/var/lib/cni/bin\\\",\\\"clusterNetwork\\\":\\\"/host/run/multus/cni/net.d/10-ovn-kubernetes.conf\\\",\\\"cniVersion\\\":\\\"0.3.1\\\",\\\"daemonSocketDir\\\":\\\"/run/multus/socket\\\",\\\"globalNamespaces\\\":\\\"default,openshift-multus,openshift-sriov-network-operator,openshift-cnv\\\",\\\"logLevel\\\":\\\"verbose\\\",\\\"logToStderr\\\":true,\\\"name\\\":\\\"multus-cni-network\\\",\\\"namespaceIsolation\\\":true,\\\"type\\\":\\\"multus-shim\\\"}\"" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" podUID="1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c"
Mar 07 21:16:44.287419 master-0 kubenswrapper[7689]: I0307 21:16:44.287284 7689 generic.go:334] "Generic (PLEG): container finished" podID="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" containerID="ee323378e5f254b4936ebddaed79c44e072c4abc42a4ea5e2f28f2991df5cf33" exitCode=0
Mar 07 21:16:44.290168 master-0 kubenswrapper[7689]: I0307 21:16:44.290039 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-qd6xh_69851821-e1fc-44a8-98df-0cfe9d564126/olm-operator/0.log"
Mar 07 21:16:44.290168 master-0 kubenswrapper[7689]: I0307 21:16:44.290121 7689 generic.go:334] "Generic (PLEG): container finished" podID="69851821-e1fc-44a8-98df-0cfe9d564126" containerID="7aaed8a833b3068593d26b6804ec3a006285f7a402c4ef65546ea1c84ea6ae4d" exitCode=1
Mar 07 21:16:44.296552 master-0 kubenswrapper[7689]: I0307 21:16:44.296481 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kpsm4_27b149f7-6aff-45f3-b935-e65279f2f9ee/approver/0.log"
Mar 07 21:16:44.297257 master-0 kubenswrapper[7689]: I0307 21:16:44.297176 7689 generic.go:334] "Generic (PLEG): container finished" podID="27b149f7-6aff-45f3-b935-e65279f2f9ee" containerID="98d5387debce255a652d1b794239fb6ace25d54dad34766bdbf701b015ffe247" exitCode=1
Mar 07 21:16:44.299918 master-0 kubenswrapper[7689]: I0307 21:16:44.299853 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-j454x_7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/catalog-operator/0.log"
Mar 07 21:16:44.299918 master-0 kubenswrapper[7689]: I0307 21:16:44.299910 7689 generic.go:334] "Generic (PLEG): container finished" podID="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" containerID="6690322ef152ddb1743025780f4e212cb381fc5357beb0407cc2777292df2c5a" exitCode=1
Mar 07 21:16:44.300170 master-0 kubenswrapper[7689]: I0307 21:16:44.300010 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz"
Mar 07 21:16:44.300953 master-0 kubenswrapper[7689]: I0307 21:16:44.300901 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz"
Mar 07 21:16:47.604657 master-0 kubenswrapper[7689]: I0307 21:16:47.604584 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:47.605507 master-0 kubenswrapper[7689]: I0307 21:16:47.604669 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:47.605507 master-0 kubenswrapper[7689]: I0307 21:16:47.604773 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:47.605507 master-0 kubenswrapper[7689]: I0307 21:16:47.604913 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:48.556735 master-0 kubenswrapper[7689]: E0307 21:16:48.556592 7689 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="200ms"
Mar 07 21:16:48.701678 master-0 kubenswrapper[7689]: E0307 21:16:48.701587 7689 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0-master-0"
Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: E0307 21:16:48.701965 7689 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="34.019s"
Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702009 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"ddc814a4-b865-4a35-b5f8-f54af449fe25","Type":"ContainerDied","Data":"f84fa34d05ad67aec62ca362c7866be59185619297d89ccd25b8d12c9a739a50"}
Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702062 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" 
event={"ID":"f8980370-267c-4168-ba97-d780698533ff","Type":"ContainerDied","Data":"a365b415335d369b3b6313971188bcd1400d9e9f3efd23b32ee5ec456091c9db"} Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702097 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702118 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" event={"ID":"3faedef9-d507-48aa-82a8-f3dc9b5adeef","Type":"ContainerDied","Data":"f737e30d954aa064b6cfef3a212e4d7f5057ece37e1afcdb2a92dd75d8adab26"} Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702230 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" event={"ID":"e543d99f-e0dc-49be-95bd-c39eabd05ce8","Type":"ContainerDied","Data":"bb9512b327c952122a6ba9c90bf697a16d6d7a153e8ba4baf488a717c15e85eb"} Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702259 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702279 7689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:16:48.702676 master-0 kubenswrapper[7689]: I0307 21:16:48.702298 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31"} Mar 07 21:16:48.705628 master-0 kubenswrapper[7689]: I0307 21:16:48.705568 7689 patch_prober.go:28] interesting 
pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:48.705837 master-0 kubenswrapper[7689]: I0307 21:16:48.705637 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:48.706095 master-0 kubenswrapper[7689]: I0307 21:16:48.706053 7689 scope.go:117] "RemoveContainer" containerID="ee323378e5f254b4936ebddaed79c44e072c4abc42a4ea5e2f28f2991df5cf33" Mar 07 21:16:48.706845 master-0 kubenswrapper[7689]: I0307 21:16:48.706806 7689 scope.go:117] "RemoveContainer" containerID="a365b415335d369b3b6313971188bcd1400d9e9f3efd23b32ee5ec456091c9db" Mar 07 21:16:48.707333 master-0 kubenswrapper[7689]: I0307 21:16:48.707285 7689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e"} pod="kube-system/bootstrap-kube-controller-manager-master-0" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 07 21:16:48.707484 master-0 kubenswrapper[7689]: I0307 21:16:48.707347 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e" gracePeriod=30 Mar 07 21:16:48.709820 master-0 kubenswrapper[7689]: I0307 21:16:48.709767 7689 
scope.go:117] "RemoveContainer" containerID="f737e30d954aa064b6cfef3a212e4d7f5057ece37e1afcdb2a92dd75d8adab26" Mar 07 21:16:48.710918 master-0 kubenswrapper[7689]: I0307 21:16:48.710034 7689 scope.go:117] "RemoveContainer" containerID="bb9512b327c952122a6ba9c90bf697a16d6d7a153e8ba4baf488a717c15e85eb" Mar 07 21:16:48.730885 master-0 kubenswrapper[7689]: I0307 21:16:48.728947 7689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 07 21:16:49.097104 master-0 kubenswrapper[7689]: E0307 21:16:49.097025 7689 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:16:49.357588 master-0 kubenswrapper[7689]: I0307 21:16:49.357509 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-v4xm9_f8980370-267c-4168-ba97-d780698533ff/network-operator/0.log" Mar 07 21:16:49.362444 master-0 kubenswrapper[7689]: I0307 21:16:49.362361 7689 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e" exitCode=2 Mar 07 21:16:49.741194 master-0 kubenswrapper[7689]: I0307 21:16:49.741131 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_ddc814a4-b865-4a35-b5f8-f54af449fe25/installer/0.log" Mar 07 21:16:49.742031 master-0 kubenswrapper[7689]: I0307 21:16:49.741267 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:16:49.874575 master-0 kubenswrapper[7689]: I0307 21:16:49.874483 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddc814a4-b865-4a35-b5f8-f54af449fe25-kube-api-access\") pod \"ddc814a4-b865-4a35-b5f8-f54af449fe25\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " Mar 07 21:16:49.874966 master-0 kubenswrapper[7689]: I0307 21:16:49.874660 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-var-lock\") pod \"ddc814a4-b865-4a35-b5f8-f54af449fe25\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " Mar 07 21:16:49.874966 master-0 kubenswrapper[7689]: I0307 21:16:49.874732 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-kubelet-dir\") pod \"ddc814a4-b865-4a35-b5f8-f54af449fe25\" (UID: \"ddc814a4-b865-4a35-b5f8-f54af449fe25\") " Mar 07 21:16:49.875119 master-0 kubenswrapper[7689]: I0307 21:16:49.874895 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-var-lock" (OuterVolumeSpecName: "var-lock") pod "ddc814a4-b865-4a35-b5f8-f54af449fe25" (UID: "ddc814a4-b865-4a35-b5f8-f54af449fe25"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:16:49.875119 master-0 kubenswrapper[7689]: I0307 21:16:49.875058 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "ddc814a4-b865-4a35-b5f8-f54af449fe25" (UID: "ddc814a4-b865-4a35-b5f8-f54af449fe25"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:16:49.875441 master-0 kubenswrapper[7689]: I0307 21:16:49.875325 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:49.875441 master-0 kubenswrapper[7689]: I0307 21:16:49.875369 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/ddc814a4-b865-4a35-b5f8-f54af449fe25-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:49.880085 master-0 kubenswrapper[7689]: I0307 21:16:49.880013 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddc814a4-b865-4a35-b5f8-f54af449fe25-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "ddc814a4-b865-4a35-b5f8-f54af449fe25" (UID: "ddc814a4-b865-4a35-b5f8-f54af449fe25"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:16:49.976913 master-0 kubenswrapper[7689]: I0307 21:16:49.976451 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ddc814a4-b865-4a35-b5f8-f54af449fe25-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:16:50.378818 master-0 kubenswrapper[7689]: I0307 21:16:50.377634 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_ddc814a4-b865-4a35-b5f8-f54af449fe25/installer/0.log" Mar 07 21:16:50.378818 master-0 kubenswrapper[7689]: I0307 21:16:50.377827 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:16:50.606229 master-0 kubenswrapper[7689]: I0307 21:16:50.605824 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:50.606229 master-0 kubenswrapper[7689]: I0307 21:16:50.606089 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:50.606229 master-0 kubenswrapper[7689]: I0307 21:16:50.606156 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:50.606927 master-0 kubenswrapper[7689]: I0307 21:16:50.606301 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:50.761005 master-0 kubenswrapper[7689]: E0307 21:16:50.760711 7689 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{package-server-manager-854648ff6d-kr9ft.189aaba869ceace3 openshift-operator-lifecycle-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-operator-lifecycle-manager,Name:package-server-manager-854648ff6d-kr9ft,UID:e720291b-0f96-4ebb-80f2-5df7cb194ffc,APIVersion:v1,ResourceVersion:3564,FieldPath:spec.containers{package-server-manager},},Reason:Pulled,Message:Successfully pulled image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\" in 9.009s (9.009s including waiting). Image size: 862633255 bytes.,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:15:43.141719267 +0000 UTC m=+76.694046169,LastTimestamp:2026-03-07 21:15:43.141719267 +0000 UTC m=+76.694046169,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:16:52.390139 master-0 kubenswrapper[7689]: I0307 21:16:52.390012 7689 generic.go:334] "Generic (PLEG): container finished" podID="24f69689-ff12-4786-af05-61429e9eadf8" containerID="c541936d2c1e33ad24f13bb7de438be39b6542e54689f0c9212561c0b1fef232" exitCode=0 Mar 07 21:16:53.207830 master-0 kubenswrapper[7689]: I0307 21:16:53.207697 7689 patch_prober.go:28] interesting pod/catalog-operator-7d9c49f57b-j454x container/catalog-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" start-of-body= Mar 07 21:16:53.207830 master-0 kubenswrapper[7689]: I0307 21:16:53.207768 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" podUID="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" containerName="catalog-operator" probeResult="failure" output="Get 
\"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" Mar 07 21:16:53.208370 master-0 kubenswrapper[7689]: I0307 21:16:53.207911 7689 patch_prober.go:28] interesting pod/catalog-operator-7d9c49f57b-j454x container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" start-of-body= Mar 07 21:16:53.208370 master-0 kubenswrapper[7689]: I0307 21:16:53.208005 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" podUID="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.128.0.25:8443/healthz\": dial tcp 10.128.0.25:8443: connect: connection refused" Mar 07 21:16:53.208580 master-0 kubenswrapper[7689]: I0307 21:16:53.208385 7689 patch_prober.go:28] interesting pod/olm-operator-d64cfc9db-qd6xh container/olm-operator namespace/openshift-operator-lifecycle-manager: Liveness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" start-of-body= Mar 07 21:16:53.208580 master-0 kubenswrapper[7689]: I0307 21:16:53.208453 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" podUID="69851821-e1fc-44a8-98df-0cfe9d564126" containerName="olm-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Mar 07 21:16:53.208580 master-0 kubenswrapper[7689]: I0307 21:16:53.208469 7689 patch_prober.go:28] interesting pod/olm-operator-d64cfc9db-qd6xh container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: 
connection refused" start-of-body= Mar 07 21:16:53.208912 master-0 kubenswrapper[7689]: I0307 21:16:53.208561 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" podUID="69851821-e1fc-44a8-98df-0cfe9d564126" containerName="olm-operator" probeResult="failure" output="Get \"https://10.128.0.23:8443/healthz\": dial tcp 10.128.0.23:8443: connect: connection refused" Mar 07 21:16:53.605946 master-0 kubenswrapper[7689]: I0307 21:16:53.605814 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Liveness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:53.605946 master-0 kubenswrapper[7689]: I0307 21:16:53.605875 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body= Mar 07 21:16:53.605946 master-0 kubenswrapper[7689]: I0307 21:16:53.605903 7689 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:53.605946 master-0 kubenswrapper[7689]: I0307 21:16:53.605938 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": 
dial tcp 10.128.0.14:8443: connect: connection refused" Mar 07 21:16:54.406285 master-0 kubenswrapper[7689]: I0307 21:16:54.406200 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_bc5c4a14-0fdc-4c09-abda-7a2277a20c54/installer/0.log" Mar 07 21:16:54.406285 master-0 kubenswrapper[7689]: I0307 21:16:54.406291 7689 generic.go:334] "Generic (PLEG): container finished" podID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerID="f731d58484b6e995b134d609352f74f3a18338de0be2a0cddb04f00bff760ac6" exitCode=1 Mar 07 21:16:56.041747 master-0 kubenswrapper[7689]: E0307 21:16:56.041661 7689 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="7.339s" Mar 07 21:16:56.041747 master-0 kubenswrapper[7689]: I0307 21:16:56.041745 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0" Mar 07 21:16:56.042651 master-0 kubenswrapper[7689]: I0307 21:16:56.041785 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 07 21:16:56.051032 master-0 kubenswrapper[7689]: I0307 21:16:56.050956 7689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0-master-0" podUID="" Mar 07 21:16:56.058919 master-0 kubenswrapper[7689]: W0307 21:16:56.058856 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ba27b7c_a93d_4d6e_a8f2_ec15903dd00c.slice/crio-511a78a0e5ad980beedfdb42193fee9b75ca4f21b1aa1969b03cf0ced5088a16 WatchSource:0}: Error finding container 511a78a0e5ad980beedfdb42193fee9b75ca4f21b1aa1969b03cf0ced5088a16: Status 404 returned error can't find the container with id 511a78a0e5ad980beedfdb42193fee9b75ca4f21b1aa1969b03cf0ced5088a16 Mar 07 21:16:56.071532 master-0 kubenswrapper[7689]: I0307 21:16:56.071473 7689 kubelet.go:2542] "SyncLoop (probe)" 
probe="liveness" status="unhealthy" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:16:56.071889 master-0 kubenswrapper[7689]: I0307 21:16:56.071827 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 07 21:16:56.071975 master-0 kubenswrapper[7689]: I0307 21:16:56.071908 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerDied","Data":"e4c20cfb39db1342bdb31f41fc9c1caf9efa43065ea9e9334f061db96ddead54"} Mar 07 21:16:56.071975 master-0 kubenswrapper[7689]: I0307 21:16:56.071946 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" event={"ID":"b88c5fbe-e19f-45b3-ab03-e1626f95776d","Type":"ContainerDied","Data":"4dd4ab96de66a81d1a97cd72bb912ec500681a0000024a0cfaf545c2eaf36106"} Mar 07 21:16:56.071975 master-0 kubenswrapper[7689]: I0307 21:16:56.071963 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" event={"ID":"5f82d4aa-0cb5-477f-944e-745a21d124fc","Type":"ContainerDied","Data":"42f741a1d8745f4ba4855310764e131077825a56cb2981843ca7f7c641b06c4d"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.071981 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" event={"ID":"abfb5602-7255-43d7-a510-e7f94885887e","Type":"ContainerDied","Data":"98e7e40d5b40416680e1b256712d9b6487df5695b6f01c16e2334511df19f429"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072020 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072034 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072045 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072057 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072068 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072080 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" event={"ID":"29624e4f-d970-4dfa-a8f1-515b73397c8f","Type":"ContainerDied","Data":"7a9945baea4c13f880fbc215f8a1631a572c12331242f734424a747e14d18656"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072096 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" 
event={"ID":"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602","Type":"ContainerDied","Data":"ee323378e5f254b4936ebddaed79c44e072c4abc42a4ea5e2f28f2991df5cf33"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072113 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" event={"ID":"69851821-e1fc-44a8-98df-0cfe9d564126","Type":"ContainerDied","Data":"7aaed8a833b3068593d26b6804ec3a006285f7a402c4ef65546ea1c84ea6ae4d"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072128 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" event={"ID":"29624e4f-d970-4dfa-a8f1-515b73397c8f","Type":"ContainerStarted","Data":"422fc7c90ae0810330b8638468887bc09b7443376ae150d6d32db5bf56fae2bc"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072141 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kpsm4" event={"ID":"27b149f7-6aff-45f3-b935-e65279f2f9ee","Type":"ContainerDied","Data":"98d5387debce255a652d1b794239fb6ace25d54dad34766bdbf701b015ffe247"} Mar 07 21:16:56.072144 master-0 kubenswrapper[7689]: I0307 21:16:56.072157 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" event={"ID":"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149","Type":"ContainerDied","Data":"6690322ef152ddb1743025780f4e212cb381fc5357beb0407cc2777292df2c5a"} Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072174 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" event={"ID":"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602","Type":"ContainerStarted","Data":"c83498128763a2f148ac39982dea44c5fce21b488aae118bfb334b72079782c3"} Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072190 
7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" event={"ID":"f8980370-267c-4168-ba97-d780698533ff","Type":"ContainerStarted","Data":"0c0b389df5a30d4ee03cfc1ba37848c4943ddd2770dea8c045d43b6813299002"}
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072205 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e"}
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072224 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"554ffc5919fe7a46fc0ad2b26594bc2dec62e5f792ce74d74fe8d549af25bf01"}
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072237 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" event={"ID":"e543d99f-e0dc-49be-95bd-c39eabd05ce8","Type":"ContainerStarted","Data":"51518dd6d57ee6b2083f82835ea17520ae78e6eccb08b7f3df5d87f49c47cb9b"}
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072254 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" event={"ID":"3faedef9-d507-48aa-82a8-f3dc9b5adeef","Type":"ContainerStarted","Data":"4e33c5e8c3d0187cfc4346672ae44f031e72cbbaa0018c58d07a068f776b253c"}
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072269 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-master-0" event={"ID":"ddc814a4-b865-4a35-b5f8-f54af449fe25","Type":"ContainerDied","Data":"463d8bfc31fe475b18975fa1110d938e01959c570bcc75066d9a8d30bafab290"}
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072288 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463d8bfc31fe475b18975fa1110d938e01959c570bcc75066d9a8d30bafab290"
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072301 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" event={"ID":"24f69689-ff12-4786-af05-61429e9eadf8","Type":"ContainerDied","Data":"c541936d2c1e33ad24f13bb7de438be39b6542e54689f0c9212561c0b1fef232"}
Mar 07 21:16:56.072630 master-0 kubenswrapper[7689]: I0307 21:16:56.072319 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"bc5c4a14-0fdc-4c09-abda-7a2277a20c54","Type":"ContainerDied","Data":"f731d58484b6e995b134d609352f74f3a18338de0be2a0cddb04f00bff760ac6"}
Mar 07 21:16:56.073075 master-0 kubenswrapper[7689]: I0307 21:16:56.072995 7689 scope.go:117] "RemoveContainer" containerID="1720f06011ab4886e92b7c5a8e88d7c953f6ae789c60589ab28e6980a7428f51"
Mar 07 21:16:56.073124 master-0 kubenswrapper[7689]: I0307 21:16:56.073030 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:56.073227 master-0 kubenswrapper[7689]: I0307 21:16:56.073158 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:56.074021 master-0 kubenswrapper[7689]: I0307 21:16:56.073987 7689 scope.go:117] "RemoveContainer" containerID="4dd4ab96de66a81d1a97cd72bb912ec500681a0000024a0cfaf545c2eaf36106"
Mar 07 21:16:56.074952 master-0 kubenswrapper[7689]: I0307 21:16:56.074920 7689 scope.go:117] "RemoveContainer" containerID="98e7e40d5b40416680e1b256712d9b6487df5695b6f01c16e2334511df19f429"
Mar 07 21:16:56.075217 master-0 kubenswrapper[7689]: I0307 21:16:56.075164 7689 scope.go:117] "RemoveContainer" containerID="98d5387debce255a652d1b794239fb6ace25d54dad34766bdbf701b015ffe247"
Mar 07 21:16:56.075836 master-0 kubenswrapper[7689]: I0307 21:16:56.075801 7689 scope.go:117] "RemoveContainer" containerID="c541936d2c1e33ad24f13bb7de438be39b6542e54689f0c9212561c0b1fef232"
Mar 07 21:16:56.076799 master-0 kubenswrapper[7689]: I0307 21:16:56.076726 7689 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="openshift-config-operator" containerStatusID={"Type":"cri-o","ID":"422fc7c90ae0810330b8638468887bc09b7443376ae150d6d32db5bf56fae2bc"} pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" containerMessage="Container openshift-config-operator failed liveness probe, will be restarted"
Mar 07 21:16:56.076895 master-0 kubenswrapper[7689]: I0307 21:16:56.076813 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" containerID="cri-o://422fc7c90ae0810330b8638468887bc09b7443376ae150d6d32db5bf56fae2bc" gracePeriod=30
Mar 07 21:16:56.077303 master-0 kubenswrapper[7689]: I0307 21:16:56.077254 7689 scope.go:117] "RemoveContainer" containerID="e4c20cfb39db1342bdb31f41fc9c1caf9efa43065ea9e9334f061db96ddead54"
Mar 07 21:16:56.079098 master-0 kubenswrapper[7689]: I0307 21:16:56.079016 7689 scope.go:117] "RemoveContainer" containerID="7aaed8a833b3068593d26b6804ec3a006285f7a402c4ef65546ea1c84ea6ae4d"
Mar 07 21:16:56.079808 master-0 kubenswrapper[7689]: I0307 21:16:56.079762 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"]
Mar 07 21:16:56.080126 master-0 kubenswrapper[7689]: I0307 21:16:56.080089 7689 scope.go:117] "RemoveContainer" containerID="6690322ef152ddb1743025780f4e212cb381fc5357beb0407cc2777292df2c5a"
Mar 07 21:16:56.088394 master-0 kubenswrapper[7689]: I0307 21:16:56.087931 7689 scope.go:117] "RemoveContainer" containerID="42f741a1d8745f4ba4855310764e131077825a56cb2981843ca7f7c641b06c4d"
Mar 07 21:16:56.088616 master-0 kubenswrapper[7689]: I0307 21:16:56.088568 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 07 21:16:56.088616 master-0 kubenswrapper[7689]: I0307 21:16:56.088601 7689 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="6dc999d8-497c-4cd8-9aed-35f8cf6c69ed"
Mar 07 21:16:56.149753 master-0 kubenswrapper[7689]: I0307 21:16:56.149693 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0-master-0"]
Mar 07 21:16:56.150052 master-0 kubenswrapper[7689]: I0307 21:16:56.150028 7689 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-etcd/etcd-master-0-master-0" mirrorPodUID="6dc999d8-497c-4cd8-9aed-35f8cf6c69ed"
Mar 07 21:16:56.150124 master-0 kubenswrapper[7689]: I0307 21:16:56.150114 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz"]
Mar 07 21:16:56.152312 master-0 kubenswrapper[7689]: I0307 21:16:56.152297 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 07 21:16:56.153883 master-0 kubenswrapper[7689]: I0307 21:16:56.153825 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/installer-1-master-0"]
Mar 07 21:16:56.450923 master-0 kubenswrapper[7689]: I0307 21:16:56.450873 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" event={"ID":"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c","Type":"ContainerStarted","Data":"511a78a0e5ad980beedfdb42193fee9b75ca4f21b1aa1969b03cf0ced5088a16"}
Mar 07 21:16:56.460077 master-0 kubenswrapper[7689]: I0307 21:16:56.460039 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" event={"ID":"b88c5fbe-e19f-45b3-ab03-e1626f95776d","Type":"ContainerStarted","Data":"6515463db6f9ce965be84cb71c1f0e20d94e857c0b4493dc68cf3bf5a9e7f345"}
Mar 07 21:16:56.485532 master-0 kubenswrapper[7689]: I0307 21:16:56.485488 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-cb227_29624e4f-d970-4dfa-a8f1-515b73397c8f/openshift-config-operator/1.log"
Mar 07 21:16:56.487580 master-0 kubenswrapper[7689]: I0307 21:16:56.487539 7689 generic.go:334] "Generic (PLEG): container finished" podID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerID="422fc7c90ae0810330b8638468887bc09b7443376ae150d6d32db5bf56fae2bc" exitCode=255
Mar 07 21:16:56.487870 master-0 kubenswrapper[7689]: I0307 21:16:56.487833 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" event={"ID":"29624e4f-d970-4dfa-a8f1-515b73397c8f","Type":"ContainerDied","Data":"422fc7c90ae0810330b8638468887bc09b7443376ae150d6d32db5bf56fae2bc"}
Mar 07 21:16:56.487918 master-0 kubenswrapper[7689]: I0307 21:16:56.487896 7689 scope.go:117] "RemoveContainer" containerID="7a9945baea4c13f880fbc215f8a1631a572c12331242f734424a747e14d18656"
Mar 07 21:16:56.508046 master-0 kubenswrapper[7689]: I0307 21:16:56.508004 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 07 21:16:56.516620 master-0 kubenswrapper[7689]: E0307 21:16:56.516578 7689 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0"
Mar 07 21:16:56.605918 master-0 kubenswrapper[7689]: I0307 21:16:56.605857 7689 patch_prober.go:28] interesting pod/openshift-config-operator-64488f9d78-cb227 container/openshift-config-operator namespace/openshift-config-operator: Readiness probe status=failure output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused" start-of-body=
Mar 07 21:16:56.606035 master-0 kubenswrapper[7689]: I0307 21:16:56.605939 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" podUID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerName="openshift-config-operator" probeResult="failure" output="Get \"https://10.128.0.14:8443/healthz\": dial tcp 10.128.0.14:8443: connect: connection refused"
Mar 07 21:16:56.702792 master-0 kubenswrapper[7689]: I0307 21:16:56.701635 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="576e332a-c381-4582-bb5e-02d32bb376a4" path="/var/lib/kubelet/pods/576e332a-c381-4582-bb5e-02d32bb376a4/volumes"
Mar 07 21:16:56.800578 master-0 kubenswrapper[7689]: I0307 21:16:56.800363 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=3.800341907 podStartE2EDuration="3.800341907s" podCreationTimestamp="2026-03-07 21:16:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:16:56.774337156 +0000 UTC m=+150.326664038" watchObservedRunningTime="2026-03-07 21:16:56.800341907 +0000 UTC m=+150.352668809"
Mar 07 21:16:56.821778 master-0 kubenswrapper[7689]: I0307 21:16:56.821725 7689 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:16:56.839259 master-0 kubenswrapper[7689]: I0307 21:16:56.837793 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_bc5c4a14-0fdc-4c09-abda-7a2277a20c54/installer/0.log"
Mar 07 21:16:56.839259 master-0 kubenswrapper[7689]: I0307 21:16:56.837897 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 07 21:16:57.000216 master-0 kubenswrapper[7689]: I0307 21:16:57.000152 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kube-api-access\") pod \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") "
Mar 07 21:16:57.000216 master-0 kubenswrapper[7689]: I0307 21:16:57.000228 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-var-lock\") pod \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") "
Mar 07 21:16:57.000456 master-0 kubenswrapper[7689]: I0307 21:16:57.000300 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kubelet-dir\") pod \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\" (UID: \"bc5c4a14-0fdc-4c09-abda-7a2277a20c54\") "
Mar 07 21:16:57.000609 master-0 kubenswrapper[7689]: I0307 21:16:57.000578 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "bc5c4a14-0fdc-4c09-abda-7a2277a20c54" (UID: "bc5c4a14-0fdc-4c09-abda-7a2277a20c54"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:16:57.000650 master-0 kubenswrapper[7689]: I0307 21:16:57.000610 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-var-lock" (OuterVolumeSpecName: "var-lock") pod "bc5c4a14-0fdc-4c09-abda-7a2277a20c54" (UID: "bc5c4a14-0fdc-4c09-abda-7a2277a20c54"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:16:57.005834 master-0 kubenswrapper[7689]: I0307 21:16:57.004834 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "bc5c4a14-0fdc-4c09-abda-7a2277a20c54" (UID: "bc5c4a14-0fdc-4c09-abda-7a2277a20c54"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:16:57.013995 master-0 kubenswrapper[7689]: I0307 21:16:57.013929 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:16:57.102224 master-0 kubenswrapper[7689]: I0307 21:16:57.102147 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:16:57.102224 master-0 kubenswrapper[7689]: I0307 21:16:57.102187 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 07 21:16:57.102224 master-0 kubenswrapper[7689]: I0307 21:16:57.102201 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/bc5c4a14-0fdc-4c09-abda-7a2277a20c54-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 07 21:16:57.498255 master-0 kubenswrapper[7689]: I0307 21:16:57.498117 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_bc5c4a14-0fdc-4c09-abda-7a2277a20c54/installer/0.log"
Mar 07 21:16:57.498598 master-0 kubenswrapper[7689]: I0307 21:16:57.498306 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-2-master-0" event={"ID":"bc5c4a14-0fdc-4c09-abda-7a2277a20c54","Type":"ContainerDied","Data":"ae393ba35dc3e8c28db54e63f526ea5a6d375dafe0c9fef22965081dde677e6d"}
Mar 07 21:16:57.498598 master-0 kubenswrapper[7689]: I0307 21:16:57.498366 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae393ba35dc3e8c28db54e63f526ea5a6d375dafe0c9fef22965081dde677e6d"
Mar 07 21:16:57.498598 master-0 kubenswrapper[7689]: I0307 21:16:57.498521 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0"
Mar 07 21:16:57.502417 master-0 kubenswrapper[7689]: I0307 21:16:57.502056 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-qd6xh_69851821-e1fc-44a8-98df-0cfe9d564126/olm-operator/0.log"
Mar 07 21:16:57.502417 master-0 kubenswrapper[7689]: I0307 21:16:57.502206 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" event={"ID":"69851821-e1fc-44a8-98df-0cfe9d564126","Type":"ContainerStarted","Data":"9968bd6a0004d10bd88bc4a569d6dae5fc03db823c481dcc1cd654d0b7888419"}
Mar 07 21:16:57.502881 master-0 kubenswrapper[7689]: I0307 21:16:57.502795 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:16:57.506791 master-0 kubenswrapper[7689]: I0307 21:16:57.506723 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:16:57.506912 master-0 kubenswrapper[7689]: I0307 21:16:57.506872 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerStarted","Data":"bb011da2147b400e02dca81678c2674f7f4945ae82c7c12a0ca2e2e7f531abc9"}
Mar 07 21:16:57.512714 master-0 kubenswrapper[7689]: I0307 21:16:57.509773 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-j454x_7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/catalog-operator/0.log"
Mar 07 21:16:57.512714 master-0 kubenswrapper[7689]: I0307 21:16:57.509915 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" event={"ID":"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149","Type":"ContainerStarted","Data":"1632c04653266b7e2cb93a258b4bfdf511e1cad4198b68ac2ab0010b4fb828df"}
Mar 07 21:16:57.512714 master-0 kubenswrapper[7689]: I0307 21:16:57.511415 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:16:57.516755 master-0 kubenswrapper[7689]: I0307 21:16:57.513567 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" event={"ID":"5f82d4aa-0cb5-477f-944e-745a21d124fc","Type":"ContainerStarted","Data":"1bab61317a32a2dec477d9b42b23b9a807e86e8cc79f0edbeecca2b500377458"}
Mar 07 21:16:57.516755 master-0 kubenswrapper[7689]: I0307 21:16:57.516490 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:16:57.518800 master-0 kubenswrapper[7689]: I0307 21:16:57.518756 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" event={"ID":"abfb5602-7255-43d7-a510-e7f94885887e","Type":"ContainerStarted","Data":"39007789be66eb488faa54345a100705571cfad0f002f23e0dcd219cdce1ebd3"}
Mar 07 21:16:57.521916 master-0 kubenswrapper[7689]: I0307 21:16:57.521877 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-cb227_29624e4f-d970-4dfa-a8f1-515b73397c8f/openshift-config-operator/1.log"
Mar 07 21:16:57.522240 master-0 kubenswrapper[7689]: I0307 21:16:57.522204 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" event={"ID":"29624e4f-d970-4dfa-a8f1-515b73397c8f","Type":"ContainerStarted","Data":"39b316f007b3a6bc156dcc3c9bc42807bb03f195281b137e19d8d61a53142b5b"}
Mar 07 21:16:57.522631 master-0 kubenswrapper[7689]: I0307 21:16:57.522599 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:16:57.524822 master-0 kubenswrapper[7689]: I0307 21:16:57.524785 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" event={"ID":"24f69689-ff12-4786-af05-61429e9eadf8","Type":"ContainerStarted","Data":"f61c9664ad5014f7591f08646987ba716f66b5b9dca224d83b995060556f0add"}
Mar 07 21:16:57.531616 master-0 kubenswrapper[7689]: I0307 21:16:57.531573 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kpsm4_27b149f7-6aff-45f3-b935-e65279f2f9ee/approver/0.log"
Mar 07 21:16:57.532634 master-0 kubenswrapper[7689]: I0307 21:16:57.532587 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kpsm4" event={"ID":"27b149f7-6aff-45f3-b935-e65279f2f9ee","Type":"ContainerStarted","Data":"6c5c7fd45d6f80f9f78c1d57d4b829fe4f9dc0f4710c478f224a6b64ce861f57"}
Mar 07 21:16:58.539991 master-0 kubenswrapper[7689]: I0307 21:16:58.539752 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" event={"ID":"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c","Type":"ContainerStarted","Data":"27a84f3840d7bd704fbc6124aef0ebf7f4eef91692b179d35440231e945dc9fe"}
Mar 07 21:16:59.556155 master-0 kubenswrapper[7689]: I0307 21:16:59.556065 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:16:59.592013 master-0 kubenswrapper[7689]: I0307 21:16:59.591874 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" podStartSLOduration=77.399244783 podStartE2EDuration="1m19.591841115s" podCreationTimestamp="2026-03-07 21:15:40 +0000 UTC" firstStartedPulling="2026-03-07 21:16:56.061676034 +0000 UTC m=+149.614002926" lastFinishedPulling="2026-03-07 21:16:58.254272366 +0000 UTC m=+151.806599258" observedRunningTime="2026-03-07 21:16:58.578626357 +0000 UTC m=+152.130953289" watchObservedRunningTime="2026-03-07 21:16:59.591841115 +0000 UTC m=+153.144168037"
Mar 07 21:17:04.337124 master-0 kubenswrapper[7689]: I0307 21:17:04.336998 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:17:04.343667 master-0 kubenswrapper[7689]: I0307 21:17:04.343598 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:17:04.587673 master-0 kubenswrapper[7689]: I0307 21:17:04.587443 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:17:12.630478 master-0 kubenswrapper[7689]: I0307 21:17:12.630438 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-tklw9_47ecf172-666e-4360-97ff-bd9dbccc1fd6/ingress-operator/0.log"
Mar 07 21:17:12.631196 master-0 kubenswrapper[7689]: I0307 21:17:12.631169 7689 generic.go:334] "Generic (PLEG): container finished" podID="47ecf172-666e-4360-97ff-bd9dbccc1fd6" containerID="d9c9700ef3cdaba6833e00d44e39806385f696f37ff17a4df92695c36e563c13" exitCode=1
Mar 07 21:17:12.631302 master-0 kubenswrapper[7689]: I0307 21:17:12.631282 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" event={"ID":"47ecf172-666e-4360-97ff-bd9dbccc1fd6","Type":"ContainerDied","Data":"d9c9700ef3cdaba6833e00d44e39806385f696f37ff17a4df92695c36e563c13"}
Mar 07 21:17:12.631851 master-0 kubenswrapper[7689]: I0307 21:17:12.631833 7689 scope.go:117] "RemoveContainer" containerID="d9c9700ef3cdaba6833e00d44e39806385f696f37ff17a4df92695c36e563c13"
Mar 07 21:17:13.072847 master-0 kubenswrapper[7689]: I0307 21:17:13.072716 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: E0307 21:17:13.072961 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e757a93e-91aa-4fce-949b-4c51a060528e" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: I0307 21:17:13.072974 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="e757a93e-91aa-4fce-949b-4c51a060528e" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: E0307 21:17:13.072984 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: I0307 21:17:13.072991 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: E0307 21:17:13.073016 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: I0307 21:17:13.073023 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: E0307 21:17:13.073035 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576e332a-c381-4582-bb5e-02d32bb376a4" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: I0307 21:17:13.073041 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="576e332a-c381-4582-bb5e-02d32bb376a4" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: E0307 21:17:13.073054 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerName="installer"
Mar 07 21:17:13.073061 master-0 kubenswrapper[7689]: I0307 21:17:13.073060 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerName="installer"
Mar 07 21:17:13.073421 master-0 kubenswrapper[7689]: I0307 21:17:13.073150 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerName="installer"
Mar 07 21:17:13.073421 master-0 kubenswrapper[7689]: I0307 21:17:13.073164 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="e757a93e-91aa-4fce-949b-4c51a060528e" containerName="installer"
Mar 07 21:17:13.073421 master-0 kubenswrapper[7689]: I0307 21:17:13.073192 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerName="installer"
Mar 07 21:17:13.073421 master-0 kubenswrapper[7689]: I0307 21:17:13.073207 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerName="installer"
Mar 07 21:17:13.073421 master-0 kubenswrapper[7689]: I0307 21:17:13.073221 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="576e332a-c381-4582-bb5e-02d32bb376a4" containerName="installer"
Mar 07 21:17:13.073645 master-0 kubenswrapper[7689]: I0307 21:17:13.073615 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.075604 master-0 kubenswrapper[7689]: I0307 21:17:13.075570 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-v4v2q"
Mar 07 21:17:13.076874 master-0 kubenswrapper[7689]: I0307 21:17:13.076830 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 07 21:17:13.097086 master-0 kubenswrapper[7689]: I0307 21:17:13.097030 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 07 21:17:13.182182 master-0 kubenswrapper[7689]: I0307 21:17:13.182100 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.182445 master-0 kubenswrapper[7689]: I0307 21:17:13.182262 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.182499 master-0 kubenswrapper[7689]: I0307 21:17:13.182444 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.284301 master-0 kubenswrapper[7689]: I0307 21:17:13.284226 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.284301 master-0 kubenswrapper[7689]: I0307 21:17:13.284288 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.284711 master-0 kubenswrapper[7689]: I0307 21:17:13.284605 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.284711 master-0 kubenswrapper[7689]: I0307 21:17:13.284656 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.284912 master-0 kubenswrapper[7689]: I0307 21:17:13.284819 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.306635 master-0 kubenswrapper[7689]: I0307 21:17:13.306584 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.389821 master-0 kubenswrapper[7689]: I0307 21:17:13.389730 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:17:13.641360 master-0 kubenswrapper[7689]: I0307 21:17:13.640886 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-tklw9_47ecf172-666e-4360-97ff-bd9dbccc1fd6/ingress-operator/0.log"
Mar 07 21:17:13.641360 master-0 kubenswrapper[7689]: I0307 21:17:13.641000 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" event={"ID":"47ecf172-666e-4360-97ff-bd9dbccc1fd6","Type":"ContainerStarted","Data":"017543294a55dc7e7e8c4e64c026ddee0e9502e377d28d5cc087ecdc76917bb4"}
Mar 07 21:17:13.807481 master-0 kubenswrapper[7689]: I0307 21:17:13.807427 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-1-retry-1-master-0"]
Mar 07 21:17:13.819012 master-0 kubenswrapper[7689]: W0307 21:17:13.818941 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod2357c135_5d09_4657_9038_48d25ed55b2d.slice/crio-ef98d2107480b4bd6f967de3d6f619d44a784a65573272c4ea9717c84d83ed26 WatchSource:0}: Error finding container ef98d2107480b4bd6f967de3d6f619d44a784a65573272c4ea9717c84d83ed26: Status 404 returned error can't find the container with id ef98d2107480b4bd6f967de3d6f619d44a784a65573272c4ea9717c84d83ed26
Mar 07 21:17:14.564234 master-0 kubenswrapper[7689]: I0307 21:17:14.564075 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"]
Mar 07 21:17:14.565731 master-0 kubenswrapper[7689]: I0307 21:17:14.565662 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.568498 master-0 kubenswrapper[7689]: I0307 21:17:14.568419 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"]
Mar 07 21:17:14.569204 master-0 kubenswrapper[7689]: I0307 21:17:14.569179 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.569747 master-0 kubenswrapper[7689]: I0307 21:17:14.569705 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lvvbn"
Mar 07 21:17:14.569747 master-0 kubenswrapper[7689]: I0307 21:17:14.569730 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 07 21:17:14.570791 master-0 kubenswrapper[7689]: I0307 21:17:14.570761 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 07 21:17:14.571532 master-0 kubenswrapper[7689]: I0307 21:17:14.571262 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-wq5zr"
Mar 07 21:17:14.571532 master-0 kubenswrapper[7689]: I0307 21:17:14.571298 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 07 21:17:14.571532 master-0 kubenswrapper[7689]: I0307 21:17:14.571369 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls"
Mar 07 21:17:14.571669 master-0 kubenswrapper[7689]: I0307 21:17:14.571648 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 07 21:17:14.572590 master-0 kubenswrapper[7689]: I0307 21:17:14.572568 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 07 21:17:14.578051 master-0 kubenswrapper[7689]: I0307 21:17:14.578015 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"]
Mar 07 21:17:14.578736 master-0 kubenswrapper[7689]: I0307 21:17:14.578706 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.581010 master-0 kubenswrapper[7689]: I0307 21:17:14.580606 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images"
Mar 07 21:17:14.581010 master-0 kubenswrapper[7689]: I0307 21:17:14.580892 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 07 21:17:14.581010 master-0 kubenswrapper[7689]: I0307 21:17:14.580922 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 07 21:17:14.581010 master-0 kubenswrapper[7689]: I0307 21:17:14.580936 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-7fv8q"
Mar 07 21:17:14.581010 master-0 kubenswrapper[7689]: I0307 21:17:14.580935 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-vxpb5"]
Mar 07 21:17:14.581336 master-0 kubenswrapper[7689]: I0307 21:17:14.581143 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 07 21:17:14.581954 master-0 kubenswrapper[7689]: I0307 21:17:14.581930 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:17:14.584143 master-0 kubenswrapper[7689]: I0307 21:17:14.584090 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-kd5ps"
Mar 07 21:17:14.586464 master-0 kubenswrapper[7689]: I0307 21:17:14.586433 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rw59s"]
Mar 07 21:17:14.587268 master-0 kubenswrapper[7689]: I0307 21:17:14.587236 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:17:14.589235 master-0 kubenswrapper[7689]: I0307 21:17:14.589192 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-df95k"
Mar 07 21:17:14.605099 master-0 kubenswrapper[7689]: I0307 21:17:14.605065 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-rlx9x"]
Mar 07 21:17:14.606090 master-0 kubenswrapper[7689]: I0307 21:17:14.606057 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:14.608939 master-0 kubenswrapper[7689]: I0307 21:17:14.608908 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 07 21:17:14.609305 master-0 kubenswrapper[7689]: I0307 21:17:14.609290 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 07 21:17:14.609729 master-0 kubenswrapper[7689]: I0307 21:17:14.609713 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-lbvsg" Mar 07 21:17:14.609929 master-0 kubenswrapper[7689]: I0307 21:17:14.609893 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 07 21:17:14.610055 master-0 kubenswrapper[7689]: I0307 21:17:14.610030 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"] Mar 07 21:17:14.610173 master-0 kubenswrapper[7689]: I0307 21:17:14.610160 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 07 21:17:14.610462 master-0 kubenswrapper[7689]: I0307 21:17:14.610447 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 07 21:17:14.611521 master-0 kubenswrapper[7689]: I0307 21:17:14.610812 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" Mar 07 21:17:14.617118 master-0 kubenswrapper[7689]: I0307 21:17:14.616424 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"] Mar 07 21:17:14.617491 master-0 kubenswrapper[7689]: I0307 21:17:14.617460 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:14.617846 master-0 kubenswrapper[7689]: I0307 21:17:14.617818 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 21:17:14.618194 master-0 kubenswrapper[7689]: I0307 21:17:14.618178 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fswfb" Mar 07 21:17:14.618546 master-0 kubenswrapper[7689]: I0307 21:17:14.618532 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 21:17:14.618804 master-0 kubenswrapper[7689]: I0307 21:17:14.618788 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 21:17:14.619193 master-0 kubenswrapper[7689]: I0307 21:17:14.618795 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"] Mar 07 21:17:14.619338 master-0 kubenswrapper[7689]: I0307 21:17:14.618871 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 21:17:14.619524 master-0 kubenswrapper[7689]: I0307 21:17:14.618871 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 21:17:14.620081 master-0 kubenswrapper[7689]: I0307 21:17:14.620038 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:17:14.622779 master-0 kubenswrapper[7689]: I0307 21:17:14.622230 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-2z9v6" Mar 07 21:17:14.622779 master-0 kubenswrapper[7689]: I0307 21:17:14.622477 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 21:17:14.622779 master-0 kubenswrapper[7689]: I0307 21:17:14.622586 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 21:17:14.622779 master-0 kubenswrapper[7689]: I0307 21:17:14.622746 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 21:17:14.623186 master-0 kubenswrapper[7689]: I0307 21:17:14.622857 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 07 21:17:14.623186 master-0 kubenswrapper[7689]: I0307 21:17:14.623056 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 21:17:14.623275 master-0 kubenswrapper[7689]: I0307 21:17:14.623237 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 07 21:17:14.623417 master-0 kubenswrapper[7689]: I0307 21:17:14.623392 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 21:17:14.623550 master-0 kubenswrapper[7689]: I0307 21:17:14.623527 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 21:17:14.623733 master-0 
kubenswrapper[7689]: I0307 21:17:14.623630 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-cdmkh" Mar 07 21:17:14.623802 master-0 kubenswrapper[7689]: I0307 21:17:14.623770 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"] Mar 07 21:17:14.627763 master-0 kubenswrapper[7689]: I0307 21:17:14.627722 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z2cc9"] Mar 07 21:17:14.629648 master-0 kubenswrapper[7689]: I0307 21:17:14.629582 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:14.632799 master-0 kubenswrapper[7689]: I0307 21:17:14.632306 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-b6gqw" Mar 07 21:17:14.633841 master-0 kubenswrapper[7689]: I0307 21:17:14.633810 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:14.634817 master-0 kubenswrapper[7689]: I0307 21:17:14.634762 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 07 21:17:14.640978 master-0 kubenswrapper[7689]: I0307 21:17:14.640928 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-lj4zb" Mar 07 21:17:14.644728 master-0 kubenswrapper[7689]: I0307 21:17:14.644661 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"] Mar 07 21:17:14.652196 master-0 kubenswrapper[7689]: I0307 21:17:14.652143 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-fdltd"] Mar 07 21:17:14.652343 master-0 kubenswrapper[7689]: I0307 21:17:14.652307 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:14.653331 master-0 kubenswrapper[7689]: I0307 21:17:14.653290 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:14.653674 master-0 kubenswrapper[7689]: I0307 21:17:14.653646 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"] Mar 07 21:17:14.654439 master-0 kubenswrapper[7689]: I0307 21:17:14.654396 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 07 21:17:14.654439 master-0 kubenswrapper[7689]: I0307 21:17:14.654402 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"2357c135-5d09-4657-9038-48d25ed55b2d","Type":"ContainerStarted","Data":"c99ad91f1912453e3999a78e354c969699bc344538ab4adcf769bc12a98842c2"} Mar 07 21:17:14.654556 master-0 kubenswrapper[7689]: I0307 21:17:14.654457 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"2357c135-5d09-4657-9038-48d25ed55b2d","Type":"ContainerStarted","Data":"ef98d2107480b4bd6f967de3d6f619d44a784a65573272c4ea9717c84d83ed26"} Mar 07 21:17:14.654556 master-0 kubenswrapper[7689]: I0307 21:17:14.654426 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-5m62w" Mar 07 21:17:14.657779 master-0 kubenswrapper[7689]: I0307 21:17:14.656888 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2tlv4" Mar 07 21:17:14.657779 master-0 kubenswrapper[7689]: I0307 21:17:14.656919 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 07 21:17:14.657779 master-0 kubenswrapper[7689]: I0307 21:17:14.656954 7689 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 07 21:17:14.657779 master-0 kubenswrapper[7689]: I0307 21:17:14.656975 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 07 21:17:14.660348 master-0 kubenswrapper[7689]: I0307 21:17:14.660293 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vxpb5"] Mar 07 21:17:14.662552 master-0 kubenswrapper[7689]: I0307 21:17:14.662508 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"] Mar 07 21:17:14.665135 master-0 kubenswrapper[7689]: I0307 21:17:14.665072 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"] Mar 07 21:17:14.670757 master-0 kubenswrapper[7689]: I0307 21:17:14.670706 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"] Mar 07 21:17:14.673350 master-0 kubenswrapper[7689]: I0307 21:17:14.673316 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"] Mar 07 21:17:14.676119 master-0 kubenswrapper[7689]: I0307 21:17:14.676031 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-rlx9x"] Mar 07 21:17:14.679610 master-0 kubenswrapper[7689]: I0307 21:17:14.679542 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"] Mar 07 21:17:14.694829 master-0 kubenswrapper[7689]: I0307 21:17:14.694791 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rw59s"] Mar 07 21:17:14.694829 master-0 kubenswrapper[7689]: I0307 21:17:14.694830 7689 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdltd"] Mar 07 21:17:14.695379 master-0 kubenswrapper[7689]: I0307 21:17:14.695351 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2cc9"] Mar 07 21:17:14.719239 master-0 kubenswrapper[7689]: I0307 21:17:14.719187 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-utilities\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:14.719239 master-0 kubenswrapper[7689]: I0307 21:17:14.719231 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719258 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-drnv4\" (UniqueName: \"kubernetes.io/projected/46d1b044-16fb-4442-a554-6b15a8a1c8ae-kube-api-access-drnv4\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719277 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-utilities\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " 
pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719295 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719316 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719390 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719511 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719538 7689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-auth-proxy-config\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719578 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7c15cf-e017-478d-93bc-c7890876b383-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719605 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzg72\" (UniqueName: \"kubernetes.io/projected/53e19dea-e8cb-478d-90da-3820712d6ac9-kube-api-access-jzg72\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719624 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719648 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8512a7f6-889f-483e-960f-1ce3c834e92c-snapshots\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719670 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719703 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719721 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqtbf\" (UniqueName: \"kubernetes.io/projected/8512a7f6-889f-483e-960f-1ce3c834e92c-kube-api-access-fqtbf\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719743 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5n27m\" (UniqueName: \"kubernetes.io/projected/7f69a884-5fe8-4c03-8258-ff35396efc30-kube-api-access-5n27m\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: 
\"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719761 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719779 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-catalog-content\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719797 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719817 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-config\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719835 7689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f8mm9\" (UniqueName: \"kubernetes.io/projected/b12701eb-4226-4f9c-9398-ad0c3fea7451-kube-api-access-f8mm9\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719861 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg2ch\" (UniqueName: \"kubernetes.io/projected/df7c15cf-e017-478d-93bc-c7890876b383-kube-api-access-qg2ch\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719882 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-catalog-content\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719906 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719928 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwj77\" 
(UniqueName: \"kubernetes.io/projected/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-kube-api-access-pwj77\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719954 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/df7c15cf-e017-478d-93bc-c7890876b383-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.719977 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.720003 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hqxlr\" (UniqueName: \"kubernetes.io/projected/f08edf29-c53f-452d-880b-e8ce27b05b6f-kube-api-access-hqxlr\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.720025 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: 
\"kubernetes.io/secret/53e19dea-e8cb-478d-90da-3820712d6ac9-machine-approver-tls\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.719841 master-0 kubenswrapper[7689]: I0307 21:17:14.720047 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.749946 master-0 kubenswrapper[7689]: I0307 21:17:14.749872 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" podStartSLOduration=1.74985207 podStartE2EDuration="1.74985207s" podCreationTimestamp="2026-03-07 21:17:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:14.748506177 +0000 UTC m=+168.300833079" watchObservedRunningTime="2026-03-07 21:17:14.74985207 +0000 UTC m=+168.302178952"
Mar 07 21:17:14.821760 master-0 kubenswrapper[7689]: I0307 21:17:14.821596 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"
Mar 07 21:17:14.821760 master-0 kubenswrapper[7689]: I0307 21:17:14.821652 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg2ch\" (UniqueName: \"kubernetes.io/projected/df7c15cf-e017-478d-93bc-c7890876b383-kube-api-access-qg2ch\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.821760 master-0 kubenswrapper[7689]: I0307 21:17:14.821674 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-catalog-content\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:17:14.821760 master-0 kubenswrapper[7689]: I0307 21:17:14.821712 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.821760 master-0 kubenswrapper[7689]: I0307 21:17:14.821732 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwj77\" (UniqueName: \"kubernetes.io/projected/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-kube-api-access-pwj77\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:17:14.821760 master-0 kubenswrapper[7689]: I0307 21:17:14.821751 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/df7c15cf-e017-478d-93bc-c7890876b383-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.821760 master-0 kubenswrapper[7689]: I0307 21:17:14.821768 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.821786 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqxlr\" (UniqueName: \"kubernetes.io/projected/f08edf29-c53f-452d-880b-e8ce27b05b6f-kube-api-access-hqxlr\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.821830 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/53e19dea-e8cb-478d-90da-3820712d6ac9-machine-approver-tls\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.821858 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-khdpn\" (UniqueName: \"kubernetes.io/projected/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-kube-api-access-khdpn\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.821913 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.821955 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-utilities\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.821984 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mzlv\" (UniqueName: \"kubernetes.io/projected/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-kube-api-access-9mzlv\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822053 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-utilities\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822085 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-utilities\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822131 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822152 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-utilities\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822168 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drnv4\" (UniqueName: \"kubernetes.io/projected/46d1b044-16fb-4442-a554-6b15a8a1c8ae-kube-api-access-drnv4\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822205 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822223 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822243 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822285 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822288 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-catalog-content\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822305 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-auth-proxy-config\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822324 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7c15cf-e017-478d-93bc-c7890876b383-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822363 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822382 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-catalog-content\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.822354 master-0 kubenswrapper[7689]: I0307 21:17:14.822402 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2b28m\" (UniqueName: \"kubernetes.io/projected/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-kube-api-access-2b28m\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822447 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzg72\" (UniqueName: \"kubernetes.io/projected/53e19dea-e8cb-478d-90da-3820712d6ac9-kube-api-access-jzg72\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822473 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822523 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktjs9\" (UniqueName: \"kubernetes.io/projected/bd9cf577-3c49-417b-a6c0-9d307c113221-kube-api-access-ktjs9\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822558 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8512a7f6-889f-483e-960f-1ce3c834e92c-snapshots\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822581 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822904 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822957 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.822985 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqtbf\" (UniqueName: \"kubernetes.io/projected/8512a7f6-889f-483e-960f-1ce3c834e92c-kube-api-access-fqtbf\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.823033 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n27m\" (UniqueName: \"kubernetes.io/projected/7f69a884-5fe8-4c03-8258-ff35396efc30-kube-api-access-5n27m\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.823074 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.823128 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-catalog-content\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.823155 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.823204 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-catalog-content\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:17:14.823801 master-0 kubenswrapper[7689]: I0307 21:17:14.823227 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.824596 master-0 kubenswrapper[7689]: I0307 21:17:14.823821 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/df7c15cf-e017-478d-93bc-c7890876b383-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.824596 master-0 kubenswrapper[7689]: I0307 21:17:14.824127 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-config\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.824596 master-0 kubenswrapper[7689]: I0307 21:17:14.824166 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8mm9\" (UniqueName: \"kubernetes.io/projected/b12701eb-4226-4f9c-9398-ad0c3fea7451-kube-api-access-f8mm9\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.824596 master-0 kubenswrapper[7689]: I0307 21:17:14.824221 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9mmg\" (UniqueName: \"kubernetes.io/projected/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-kube-api-access-d9mmg\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"
Mar 07 21:17:14.824596 master-0 kubenswrapper[7689]: I0307 21:17:14.824378 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-images\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.825415 master-0 kubenswrapper[7689]: I0307 21:17:14.825348 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.825855 master-0 kubenswrapper[7689]: I0307 21:17:14.825821 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-utilities\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:17:14.826788 master-0 kubenswrapper[7689]: I0307 21:17:14.826747 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.827927 master-0 kubenswrapper[7689]: I0307 21:17:14.827875 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.828666 master-0 kubenswrapper[7689]: I0307 21:17:14.828623 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8512a7f6-889f-483e-960f-1ce3c834e92c-snapshots\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.830099 master-0 kubenswrapper[7689]: I0307 21:17:14.830052 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.830953 master-0 kubenswrapper[7689]: I0307 21:17:14.830896 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.831245 master-0 kubenswrapper[7689]: I0307 21:17:14.831197 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.831934 master-0 kubenswrapper[7689]: I0307 21:17:14.831859 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7c15cf-e017-478d-93bc-c7890876b383-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.833389 master-0 kubenswrapper[7689]: I0307 21:17:14.833332 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-auth-proxy-config\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.833984 master-0 kubenswrapper[7689]: I0307 21:17:14.833934 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-config\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.833984 master-0 kubenswrapper[7689]: I0307 21:17:14.833940 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-utilities\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:17:14.834121 master-0 kubenswrapper[7689]: I0307 21:17:14.834023 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.834121 master-0 kubenswrapper[7689]: I0307 21:17:14.834050 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.834121 master-0 kubenswrapper[7689]: I0307 21:17:14.834108 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.834285 master-0 kubenswrapper[7689]: I0307 21:17:14.834209 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.834335 master-0 kubenswrapper[7689]: I0307 21:17:14.834284 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-catalog-content\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:17:14.834381 master-0 kubenswrapper[7689]: I0307 21:17:14.834362 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.834564 master-0 kubenswrapper[7689]: I0307 21:17:14.834525 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.852824 master-0 kubenswrapper[7689]: I0307 21:17:14.852138 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/53e19dea-e8cb-478d-90da-3820712d6ac9-machine-approver-tls\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.855743 master-0 kubenswrapper[7689]: I0307 21:17:14.855611 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg2ch\" (UniqueName: \"kubernetes.io/projected/df7c15cf-e017-478d-93bc-c7890876b383-kube-api-access-qg2ch\") pod \"cluster-cloud-controller-manager-operator-559568b945-pmr9d\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.855953 master-0 kubenswrapper[7689]: I0307 21:17:14.855756 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqxlr\" (UniqueName: \"kubernetes.io/projected/f08edf29-c53f-452d-880b-e8ce27b05b6f-kube-api-access-hqxlr\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:17:14.856405 master-0 kubenswrapper[7689]: I0307 21:17:14.856356 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n27m\" (UniqueName: \"kubernetes.io/projected/7f69a884-5fe8-4c03-8258-ff35396efc30-kube-api-access-5n27m\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:17:14.857586 master-0 kubenswrapper[7689]: I0307 21:17:14.857549 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8mm9\" (UniqueName: \"kubernetes.io/projected/b12701eb-4226-4f9c-9398-ad0c3fea7451-kube-api-access-f8mm9\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.857795 master-0 kubenswrapper[7689]: I0307 21:17:14.857730 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqtbf\" (UniqueName: \"kubernetes.io/projected/8512a7f6-889f-483e-960f-1ce3c834e92c-kube-api-access-fqtbf\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:17:14.858485 master-0 kubenswrapper[7689]: I0307 21:17:14.858442 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drnv4\" (UniqueName: \"kubernetes.io/projected/46d1b044-16fb-4442-a554-6b15a8a1c8ae-kube-api-access-drnv4\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.860230 master-0 kubenswrapper[7689]: I0307 21:17:14.860199 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwj77\" (UniqueName: \"kubernetes.io/projected/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-kube-api-access-pwj77\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:17:14.860450 master-0 kubenswrapper[7689]: I0307 21:17:14.860406 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzg72\" (UniqueName: \"kubernetes.io/projected/53e19dea-e8cb-478d-90da-3820712d6ac9-kube-api-access-jzg72\") pod \"machine-approver-955fcfb87-cwdkv\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"
Mar 07 21:17:14.884452 master-0 kubenswrapper[7689]: I0307 21:17:14.884359 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"
Mar 07 21:17:14.905612 master-0 kubenswrapper[7689]: I0307 21:17:14.905580 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:17:14.923416 master-0 kubenswrapper[7689]: I0307 21:17:14.923359 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:17:14.926639 master-0 kubenswrapper[7689]: I0307 21:17:14.926580 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mmg\" (UniqueName: \"kubernetes.io/projected/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-kube-api-access-d9mmg\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"
Mar 07 21:17:14.926746 master-0 kubenswrapper[7689]: I0307 21:17:14.926638 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"
Mar 07 21:17:14.926801 master-0 kubenswrapper[7689]: I0307 21:17:14.926773 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdpn\" (UniqueName: \"kubernetes.io/projected/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-kube-api-access-khdpn\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.926856 master-0 kubenswrapper[7689]: I0307 21:17:14.926824 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-utilities\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:17:14.926899 master-0 kubenswrapper[7689]: I0307 21:17:14.926854 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mzlv\" (UniqueName: \"kubernetes.io/projected/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-kube-api-access-9mzlv\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"
Mar 07 21:17:14.926899 master-0 kubenswrapper[7689]: I0307 21:17:14.926883 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-utilities\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.926980 master-0 kubenswrapper[7689]: I0307 21:17:14.926924 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"
Mar 07 21:17:14.926980 master-0 kubenswrapper[7689]: I0307 21:17:14.926949 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-catalog-content\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.926980 master-0 kubenswrapper[7689]: I0307 21:17:14.926968 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b28m\" (UniqueName: \"kubernetes.io/projected/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-kube-api-access-2b28m\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:17:14.927724 master-0 kubenswrapper[7689]: I0307 21:17:14.927624 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-utilities\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:17:14.928398 master-0 kubenswrapper[7689]: I0307 21:17:14.928373 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-utilities\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.928543 master-0 kubenswrapper[7689]: I0307 21:17:14.928503 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-catalog-content\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:17:14.930568 master-0 kubenswrapper[7689]: I0307 21:17:14.928169 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjs9\" (UniqueName: \"kubernetes.io/projected/bd9cf577-3c49-417b-a6c0-9d307c113221-kube-api-access-ktjs9\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"
Mar 07 21:17:14.930568 master-0 kubenswrapper[7689]: I0307 21:17:14.930120 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca\") pod
\"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:14.930568 master-0 kubenswrapper[7689]: I0307 21:17:14.930172 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:14.930568 master-0 kubenswrapper[7689]: I0307 21:17:14.930200 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-catalog-content\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:14.932140 master-0 kubenswrapper[7689]: I0307 21:17:14.930653 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-catalog-content\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:14.937183 master-0 kubenswrapper[7689]: I0307 21:17:14.935741 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:17:14.937183 master-0 kubenswrapper[7689]: I0307 
21:17:14.937027 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:14.939756 master-0 kubenswrapper[7689]: I0307 21:17:14.939704 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:14.944430 master-0 kubenswrapper[7689]: I0307 21:17:14.944387 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:14.947503 master-0 kubenswrapper[7689]: I0307 21:17:14.947045 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:14.947503 master-0 kubenswrapper[7689]: I0307 21:17:14.947116 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mzlv\" (UniqueName: \"kubernetes.io/projected/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-kube-api-access-9mzlv\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:17:14.947503 master-0 kubenswrapper[7689]: I0307 21:17:14.947121 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdpn\" (UniqueName: \"kubernetes.io/projected/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-kube-api-access-khdpn\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:14.948124 master-0 kubenswrapper[7689]: I0307 21:17:14.947899 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mmg\" (UniqueName: \"kubernetes.io/projected/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-kube-api-access-d9mmg\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:14.951514 master-0 kubenswrapper[7689]: I0307 21:17:14.951469 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b28m\" (UniqueName: \"kubernetes.io/projected/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-kube-api-access-2b28m\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:14.951794 master-0 kubenswrapper[7689]: I0307 21:17:14.951759 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjs9\" 
(UniqueName: \"kubernetes.io/projected/bd9cf577-3c49-417b-a6c0-9d307c113221-kube-api-access-ktjs9\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:14.981952 master-0 kubenswrapper[7689]: I0307 21:17:14.981469 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:15.000181 master-0 kubenswrapper[7689]: I0307 21:17:14.999400 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:15.015169 master-0 kubenswrapper[7689]: I0307 21:17:15.015079 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" Mar 07 21:17:15.032549 master-0 kubenswrapper[7689]: I0307 21:17:15.032480 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:15.033752 master-0 kubenswrapper[7689]: W0307 21:17:15.033721 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53e19dea_e8cb_478d_90da_3820712d6ac9.slice/crio-ccf912ee4b17090925c8bd3ce2fd46e80c3fa5968c7c32c4d89480c998e7afa2 WatchSource:0}: Error finding container ccf912ee4b17090925c8bd3ce2fd46e80c3fa5968c7c32c4d89480c998e7afa2: Status 404 returned error can't find the container with id ccf912ee4b17090925c8bd3ce2fd46e80c3fa5968c7c32c4d89480c998e7afa2 Mar 07 21:17:15.058467 master-0 kubenswrapper[7689]: I0307 21:17:15.058002 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:17:15.073061 master-0 kubenswrapper[7689]: I0307 21:17:15.072477 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:15.084747 master-0 kubenswrapper[7689]: I0307 21:17:15.082592 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:15.099893 master-0 kubenswrapper[7689]: I0307 21:17:15.099420 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:15.112490 master-0 kubenswrapper[7689]: I0307 21:17:15.111894 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:15.334456 master-0 kubenswrapper[7689]: I0307 21:17:15.334358 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"] Mar 07 21:17:15.356862 master-0 kubenswrapper[7689]: W0307 21:17:15.356751 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb12701eb_4226_4f9c_9398_ad0c3fea7451.slice/crio-57c6ee9a56cc57dee4a273a8e3079576dd2e072ab358ac7c30617c7193ed9144 WatchSource:0}: Error finding container 57c6ee9a56cc57dee4a273a8e3079576dd2e072ab358ac7c30617c7193ed9144: Status 404 returned error can't find the container with id 57c6ee9a56cc57dee4a273a8e3079576dd2e072ab358ac7c30617c7193ed9144 Mar 07 21:17:15.430006 master-0 kubenswrapper[7689]: I0307 21:17:15.429943 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-vxpb5"] Mar 07 21:17:15.435033 master-0 kubenswrapper[7689]: I0307 21:17:15.434986 
7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"] Mar 07 21:17:15.443674 master-0 kubenswrapper[7689]: W0307 21:17:15.443627 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf08edf29_c53f_452d_880b_e8ce27b05b6f.slice/crio-651fac4a92992d28471be54ad32158ba1aad241805110f8bbba31e3953ad5abe WatchSource:0}: Error finding container 651fac4a92992d28471be54ad32158ba1aad241805110f8bbba31e3953ad5abe: Status 404 returned error can't find the container with id 651fac4a92992d28471be54ad32158ba1aad241805110f8bbba31e3953ad5abe Mar 07 21:17:15.446042 master-0 kubenswrapper[7689]: W0307 21:17:15.446019 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46d1b044_16fb_4442_a554_6b15a8a1c8ae.slice/crio-39bc19add4a37ed516d807e6562400e75516e577ed9bf7289f6d2ef65017c8cb WatchSource:0}: Error finding container 39bc19add4a37ed516d807e6562400e75516e577ed9bf7289f6d2ef65017c8cb: Status 404 returned error can't find the container with id 39bc19add4a37ed516d807e6562400e75516e577ed9bf7289f6d2ef65017c8cb Mar 07 21:17:15.574467 master-0 kubenswrapper[7689]: I0307 21:17:15.574378 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"] Mar 07 21:17:15.589303 master-0 kubenswrapper[7689]: I0307 21:17:15.577696 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-8f89dfddd-rlx9x"] Mar 07 21:17:15.589303 master-0 kubenswrapper[7689]: I0307 21:17:15.584422 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rw59s"] Mar 07 21:17:15.665511 master-0 kubenswrapper[7689]: I0307 21:17:15.665413 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" event={"ID":"b12701eb-4226-4f9c-9398-ad0c3fea7451","Type":"ContainerStarted","Data":"237a09ab11d4f645f3a091648441b51fbc332dc15c86991d76a74ce7eab81510"} Mar 07 21:17:15.665511 master-0 kubenswrapper[7689]: I0307 21:17:15.665479 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" event={"ID":"b12701eb-4226-4f9c-9398-ad0c3fea7451","Type":"ContainerStarted","Data":"57c6ee9a56cc57dee4a273a8e3079576dd2e072ab358ac7c30617c7193ed9144"} Mar 07 21:17:15.667816 master-0 kubenswrapper[7689]: I0307 21:17:15.667618 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw59s" event={"ID":"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9","Type":"ContainerStarted","Data":"c7a270720447e0a61bb1c8ec80a8415d28e52795162c44c7229c8de5a130a13d"} Mar 07 21:17:15.670131 master-0 kubenswrapper[7689]: I0307 21:17:15.670076 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" event={"ID":"53e19dea-e8cb-478d-90da-3820712d6ac9","Type":"ContainerStarted","Data":"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3"} Mar 07 21:17:15.670131 master-0 kubenswrapper[7689]: I0307 21:17:15.670104 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" event={"ID":"53e19dea-e8cb-478d-90da-3820712d6ac9","Type":"ContainerStarted","Data":"ccf912ee4b17090925c8bd3ce2fd46e80c3fa5968c7c32c4d89480c998e7afa2"} Mar 07 21:17:15.673137 master-0 kubenswrapper[7689]: I0307 21:17:15.672022 7689 generic.go:334] "Generic (PLEG): container finished" podID="f08edf29-c53f-452d-880b-e8ce27b05b6f" containerID="cbc42b42c68bace1ed2fefd81d3e1aeee69e8aeb452acfb8ef11e0a4a41a9443" exitCode=0 Mar 07 21:17:15.673137 master-0 kubenswrapper[7689]: I0307 21:17:15.672073 7689 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxpb5" event={"ID":"f08edf29-c53f-452d-880b-e8ce27b05b6f","Type":"ContainerDied","Data":"cbc42b42c68bace1ed2fefd81d3e1aeee69e8aeb452acfb8ef11e0a4a41a9443"} Mar 07 21:17:15.673137 master-0 kubenswrapper[7689]: I0307 21:17:15.672091 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxpb5" event={"ID":"f08edf29-c53f-452d-880b-e8ce27b05b6f","Type":"ContainerStarted","Data":"651fac4a92992d28471be54ad32158ba1aad241805110f8bbba31e3953ad5abe"} Mar 07 21:17:15.677927 master-0 kubenswrapper[7689]: I0307 21:17:15.677864 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" event={"ID":"46d1b044-16fb-4442-a554-6b15a8a1c8ae","Type":"ContainerStarted","Data":"7cd04186fec1f3689550529aeffb9a5d6e61f2237b907fad318acf8b8b3b0642"} Mar 07 21:17:15.678037 master-0 kubenswrapper[7689]: I0307 21:17:15.677929 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" event={"ID":"46d1b044-16fb-4442-a554-6b15a8a1c8ae","Type":"ContainerStarted","Data":"39bc19add4a37ed516d807e6562400e75516e577ed9bf7289f6d2ef65017c8cb"} Mar 07 21:17:15.680493 master-0 kubenswrapper[7689]: I0307 21:17:15.680446 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" event={"ID":"8512a7f6-889f-483e-960f-1ce3c834e92c","Type":"ContainerStarted","Data":"cdde49fab8a3c629c252f1f7390a41b3c48bf77cd72b2434083e80efd11766cc"} Mar 07 21:17:15.682467 master-0 kubenswrapper[7689]: I0307 21:17:15.682432 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerStarted","Data":"e34e1ead14a1f46ae17df72bae6b5228c67e70a551aab1dde4319fa7bdff201f"} 
Mar 07 21:17:15.808178 master-0 kubenswrapper[7689]: W0307 21:17:15.808129 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f69a884_5fe8_4c03_8258_ff35396efc30.slice/crio-b9cb4848c544aa1c865b4801097eee547b05dfc57a09d5c556b7433efc862312 WatchSource:0}: Error finding container b9cb4848c544aa1c865b4801097eee547b05dfc57a09d5c556b7433efc862312: Status 404 returned error can't find the container with id b9cb4848c544aa1c865b4801097eee547b05dfc57a09d5c556b7433efc862312 Mar 07 21:17:15.808309 master-0 kubenswrapper[7689]: I0307 21:17:15.808274 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"] Mar 07 21:17:15.820874 master-0 kubenswrapper[7689]: I0307 21:17:15.820787 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"] Mar 07 21:17:15.833596 master-0 kubenswrapper[7689]: W0307 21:17:15.833548 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd9cf577_3c49_417b_a6c0_9d307c113221.slice/crio-a574f1a608b3163ddfe99b7017727c7a66f0c962198037c0d402a194cb014376 WatchSource:0}: Error finding container a574f1a608b3163ddfe99b7017727c7a66f0c962198037c0d402a194cb014376: Status 404 returned error can't find the container with id a574f1a608b3163ddfe99b7017727c7a66f0c962198037c0d402a194cb014376 Mar 07 21:17:15.835234 master-0 kubenswrapper[7689]: I0307 21:17:15.835208 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2cc9"] Mar 07 21:17:15.839616 master-0 kubenswrapper[7689]: W0307 21:17:15.839595 7689 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7f65054f_caf3_4cd3_889e_8d5a5376b1b8.slice/crio-9e495235becad119aa39d722482114d64ceca8622cb68745ac85876c90e3baab WatchSource:0}: Error finding container 9e495235becad119aa39d722482114d64ceca8622cb68745ac85876c90e3baab: Status 404 returned error can't find the container with id 9e495235becad119aa39d722482114d64ceca8622cb68745ac85876c90e3baab Mar 07 21:17:15.957622 master-0 kubenswrapper[7689]: I0307 21:17:15.957574 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-fdltd"] Mar 07 21:17:15.973953 master-0 kubenswrapper[7689]: I0307 21:17:15.973883 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"] Mar 07 21:17:15.983174 master-0 kubenswrapper[7689]: W0307 21:17:15.983116 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5625eb9f_c80b_47b1_b70c_aa636fbc03ac.slice/crio-b91d2847ef2fd4a9afd46d414fef3fab6d77e51105ef982de75396cd9b632974 WatchSource:0}: Error finding container b91d2847ef2fd4a9afd46d414fef3fab6d77e51105ef982de75396cd9b632974: Status 404 returned error can't find the container with id b91d2847ef2fd4a9afd46d414fef3fab6d77e51105ef982de75396cd9b632974 Mar 07 21:17:15.983593 master-0 kubenswrapper[7689]: W0307 21:17:15.983560 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod85bb04ed_e2d1_496d_8f2c_9555bb3c5d78.slice/crio-f2f2e007b4a2d99fb4c65eb1e615e291749f7487d8ad5ab87f02a946335ae9ed WatchSource:0}: Error finding container f2f2e007b4a2d99fb4c65eb1e615e291749f7487d8ad5ab87f02a946335ae9ed: Status 404 returned error can't find the container with id f2f2e007b4a2d99fb4c65eb1e615e291749f7487d8ad5ab87f02a946335ae9ed Mar 07 21:17:16.231120 master-0 kubenswrapper[7689]: I0307 21:17:16.230334 7689 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"] Mar 07 21:17:16.705330 master-0 kubenswrapper[7689]: I0307 21:17:16.705025 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" event={"ID":"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78","Type":"ContainerStarted","Data":"93bcef1c8be3962f1464e979cd049ada697fa306ecaf759a5e305b87fe2579c3"} Mar 07 21:17:16.705330 master-0 kubenswrapper[7689]: I0307 21:17:16.705085 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" event={"ID":"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78","Type":"ContainerStarted","Data":"f2f2e007b4a2d99fb4c65eb1e615e291749f7487d8ad5ab87f02a946335ae9ed"} Mar 07 21:17:16.705330 master-0 kubenswrapper[7689]: I0307 21:17:16.705103 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" event={"ID":"bd9cf577-3c49-417b-a6c0-9d307c113221","Type":"ContainerStarted","Data":"a574f1a608b3163ddfe99b7017727c7a66f0c962198037c0d402a194cb014376"} Mar 07 21:17:16.707206 master-0 kubenswrapper[7689]: I0307 21:17:16.707163 7689 generic.go:334] "Generic (PLEG): container finished" podID="7f65054f-caf3-4cd3-889e-8d5a5376b1b8" containerID="35e0e5cfb37740a966e4bb6ed64ac7190b87360fa40dbcf877d2b0069b3065cd" exitCode=0 Mar 07 21:17:16.707268 master-0 kubenswrapper[7689]: I0307 21:17:16.707228 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2cc9" event={"ID":"7f65054f-caf3-4cd3-889e-8d5a5376b1b8","Type":"ContainerDied","Data":"35e0e5cfb37740a966e4bb6ed64ac7190b87360fa40dbcf877d2b0069b3065cd"} Mar 07 21:17:16.707268 master-0 kubenswrapper[7689]: I0307 21:17:16.707251 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-z2cc9" event={"ID":"7f65054f-caf3-4cd3-889e-8d5a5376b1b8","Type":"ContainerStarted","Data":"9e495235becad119aa39d722482114d64ceca8622cb68745ac85876c90e3baab"} Mar 07 21:17:16.720202 master-0 kubenswrapper[7689]: I0307 21:17:16.720002 7689 generic.go:334] "Generic (PLEG): container finished" podID="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" containerID="0b4bb2c8e80fc01b0e3b4c15c93598e07e450d614fd19ce1345979feccb7c709" exitCode=0 Mar 07 21:17:16.720202 master-0 kubenswrapper[7689]: I0307 21:17:16.720075 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdltd" event={"ID":"5625eb9f-c80b-47b1-b70c-aa636fbc03ac","Type":"ContainerDied","Data":"0b4bb2c8e80fc01b0e3b4c15c93598e07e450d614fd19ce1345979feccb7c709"} Mar 07 21:17:16.720202 master-0 kubenswrapper[7689]: I0307 21:17:16.720104 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdltd" event={"ID":"5625eb9f-c80b-47b1-b70c-aa636fbc03ac","Type":"ContainerStarted","Data":"b91d2847ef2fd4a9afd46d414fef3fab6d77e51105ef982de75396cd9b632974"} Mar 07 21:17:16.733647 master-0 kubenswrapper[7689]: I0307 21:17:16.733595 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" event={"ID":"7f69a884-5fe8-4c03-8258-ff35396efc30","Type":"ContainerStarted","Data":"19275991fadb14c56964bf46f1eed51983ab68593c8d84e2a21f6bb7a1d00a86"} Mar 07 21:17:16.733647 master-0 kubenswrapper[7689]: I0307 21:17:16.733653 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" event={"ID":"7f69a884-5fe8-4c03-8258-ff35396efc30","Type":"ContainerStarted","Data":"963c5d06828cca3d97d9a9b7afc31df05e56441fa9a4864d0679007c7b1f8c69"} Mar 07 21:17:16.733872 master-0 kubenswrapper[7689]: I0307 21:17:16.733674 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" event={"ID":"7f69a884-5fe8-4c03-8258-ff35396efc30","Type":"ContainerStarted","Data":"b9cb4848c544aa1c865b4801097eee547b05dfc57a09d5c556b7433efc862312"} Mar 07 21:17:16.738288 master-0 kubenswrapper[7689]: I0307 21:17:16.738230 7689 generic.go:334] "Generic (PLEG): container finished" podID="c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9" containerID="58159da9ebda8be7e4de56fce5a62c915f96081b06accf5dd15dc2fbdbd7247f" exitCode=0 Mar 07 21:17:16.738353 master-0 kubenswrapper[7689]: I0307 21:17:16.738331 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw59s" event={"ID":"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9","Type":"ContainerDied","Data":"58159da9ebda8be7e4de56fce5a62c915f96081b06accf5dd15dc2fbdbd7247f"} Mar 07 21:17:16.739427 master-0 kubenswrapper[7689]: I0307 21:17:16.739387 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" event={"ID":"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021","Type":"ContainerStarted","Data":"5567f1923dad84459fcc9068a666c7d7b21e33dc4f847dbb0c61779518830669"} Mar 07 21:17:17.043390 master-0 kubenswrapper[7689]: I0307 21:17:17.043160 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" podStartSLOduration=5.04309723 podStartE2EDuration="5.04309723s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:17.031228243 +0000 UTC m=+170.583555135" watchObservedRunningTime="2026-03-07 21:17:17.04309723 +0000 UTC m=+170.595424122" Mar 07 21:17:19.763799 master-0 kubenswrapper[7689]: I0307 21:17:19.763756 7689 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-machine-config-operator/machine-config-daemon-kp74q"] Mar 07 21:17:19.765251 master-0 kubenswrapper[7689]: I0307 21:17:19.765234 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.768075 master-0 kubenswrapper[7689]: I0307 21:17:19.768047 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 21:17:19.768166 master-0 kubenswrapper[7689]: I0307 21:17:19.768067 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-w2xft" Mar 07 21:17:19.833710 master-0 kubenswrapper[7689]: I0307 21:17:19.833405 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cz8d\" (UniqueName: \"kubernetes.io/projected/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-kube-api-access-7cz8d\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.833991 master-0 kubenswrapper[7689]: I0307 21:17:19.833793 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.833991 master-0 kubenswrapper[7689]: I0307 21:17:19.833883 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.833991 master-0 kubenswrapper[7689]: I0307 21:17:19.833926 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-rootfs\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.935075 master-0 kubenswrapper[7689]: I0307 21:17:19.935012 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cz8d\" (UniqueName: \"kubernetes.io/projected/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-kube-api-access-7cz8d\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.935075 master-0 kubenswrapper[7689]: I0307 21:17:19.935089 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.935448 master-0 kubenswrapper[7689]: I0307 21:17:19.935284 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.935448 master-0 kubenswrapper[7689]: I0307 21:17:19.935347 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: 
\"kubernetes.io/host-path/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-rootfs\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.935582 master-0 kubenswrapper[7689]: I0307 21:17:19.935555 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-rootfs\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.937740 master-0 kubenswrapper[7689]: I0307 21:17:19.937627 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.940303 master-0 kubenswrapper[7689]: I0307 21:17:19.940262 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:19.953546 master-0 kubenswrapper[7689]: I0307 21:17:19.953471 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cz8d\" (UniqueName: \"kubernetes.io/projected/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-kube-api-access-7cz8d\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:20.089305 master-0 kubenswrapper[7689]: I0307 21:17:20.089161 7689 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:22.264731 master-0 kubenswrapper[7689]: I0307 21:17:22.263247 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"] Mar 07 21:17:24.466395 master-0 kubenswrapper[7689]: I0307 21:17:24.466327 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx"] Mar 07 21:17:24.467796 master-0 kubenswrapper[7689]: I0307 21:17:24.467767 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.482769 master-0 kubenswrapper[7689]: I0307 21:17:24.482707 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-6xbgq" Mar 07 21:17:24.483069 master-0 kubenswrapper[7689]: I0307 21:17:24.483029 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 07 21:17:24.488340 master-0 kubenswrapper[7689]: I0307 21:17:24.487941 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx"] Mar 07 21:17:24.671123 master-0 kubenswrapper[7689]: I0307 21:17:24.671043 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.671123 master-0 kubenswrapper[7689]: I0307 21:17:24.671126 7689 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-tmpfs\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.671473 master-0 kubenswrapper[7689]: I0307 21:17:24.671161 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.671473 master-0 kubenswrapper[7689]: I0307 21:17:24.671196 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq99k\" (UniqueName: \"kubernetes.io/projected/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-kube-api-access-tq99k\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.773133 master-0 kubenswrapper[7689]: I0307 21:17:24.772965 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.773133 master-0 kubenswrapper[7689]: I0307 21:17:24.773026 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq99k\" (UniqueName: \"kubernetes.io/projected/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-kube-api-access-tq99k\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: 
\"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.773133 master-0 kubenswrapper[7689]: I0307 21:17:24.773115 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.773133 master-0 kubenswrapper[7689]: I0307 21:17:24.773147 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-tmpfs\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.774265 master-0 kubenswrapper[7689]: I0307 21:17:24.774240 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-tmpfs\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.778092 master-0 kubenswrapper[7689]: I0307 21:17:24.778053 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.778821 master-0 kubenswrapper[7689]: I0307 21:17:24.778778 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.798770 master-0 kubenswrapper[7689]: I0307 21:17:24.798718 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq99k\" (UniqueName: \"kubernetes.io/projected/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-kube-api-access-tq99k\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:24.814743 master-0 kubenswrapper[7689]: I0307 21:17:24.814700 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:38.909264 master-0 kubenswrapper[7689]: I0307 21:17:38.909172 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" event={"ID":"655b9f0a-cf27-443d-b0ea-3642dcae1ad2","Type":"ContainerStarted","Data":"112a83bbfd7da68fd7d98c9912932beebde7c37fe463c6524a512ede7b50dc89"} Mar 07 21:17:39.718181 master-0 kubenswrapper[7689]: I0307 21:17:39.718115 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx"] Mar 07 21:17:39.944734 master-0 kubenswrapper[7689]: I0307 21:17:39.944262 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdltd" event={"ID":"5625eb9f-c80b-47b1-b70c-aa636fbc03ac","Type":"ContainerStarted","Data":"2aa15de1b2bb19d02b5747770e2bfa186549430d792f08f2ab12bc57e19314a5"} Mar 07 21:17:39.956360 master-0 kubenswrapper[7689]: I0307 21:17:39.956083 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" 
event={"ID":"655b9f0a-cf27-443d-b0ea-3642dcae1ad2","Type":"ContainerStarted","Data":"9c35d34b990f0f0f014bf31acc6957c6a30b6adb77b3eb6ea46594257c1430b0"} Mar 07 21:17:39.961705 master-0 kubenswrapper[7689]: I0307 21:17:39.959296 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" event={"ID":"46d1b044-16fb-4442-a554-6b15a8a1c8ae","Type":"ContainerStarted","Data":"22bb9c50e586557c26e348d932ac5dad20b01bd083cb9c200964357361e20692"} Mar 07 21:17:39.964182 master-0 kubenswrapper[7689]: I0307 21:17:39.964142 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" event={"ID":"8512a7f6-889f-483e-960f-1ce3c834e92c","Type":"ContainerStarted","Data":"3fb440d4f7caaf03d1037e37f980ffca944b031e51e46e533c460fce8fda313b"} Mar 07 21:17:39.981351 master-0 kubenswrapper[7689]: I0307 21:17:39.980519 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2cc9" event={"ID":"7f65054f-caf3-4cd3-889e-8d5a5376b1b8","Type":"ContainerStarted","Data":"001b65d1db011b8bb96e16a7cb4e30e0932ee8b9e303affc470347e6b0c4af77"} Mar 07 21:17:39.998800 master-0 kubenswrapper[7689]: I0307 21:17:39.998397 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" podStartSLOduration=4.163577783 podStartE2EDuration="27.998366035s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:15.588144126 +0000 UTC m=+169.140471018" lastFinishedPulling="2026-03-07 21:17:39.422932368 +0000 UTC m=+192.975259270" observedRunningTime="2026-03-07 21:17:39.996809307 +0000 UTC m=+193.549136210" watchObservedRunningTime="2026-03-07 21:17:39.998366035 +0000 UTC m=+193.550692927" Mar 07 21:17:40.008134 master-0 kubenswrapper[7689]: I0307 21:17:40.007797 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" event={"ID":"bd9cf577-3c49-417b-a6c0-9d307c113221","Type":"ContainerStarted","Data":"8fa812b3126769f1b859d734a7a96fc03f149ac91f0eb8368e542c55f6e18fc4"} Mar 07 21:17:40.024617 master-0 kubenswrapper[7689]: I0307 21:17:40.023824 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw59s" event={"ID":"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9","Type":"ContainerStarted","Data":"87c8917d09451c4b2cd526c22f55e1e1633a895f47f1081055707fe4874946ed"} Mar 07 21:17:40.038979 master-0 kubenswrapper[7689]: I0307 21:17:40.036385 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="kube-rbac-proxy" containerID="cri-o://2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3" gracePeriod=30 Mar 07 21:17:40.038979 master-0 kubenswrapper[7689]: I0307 21:17:40.036385 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" event={"ID":"53e19dea-e8cb-478d-90da-3820712d6ac9","Type":"ContainerStarted","Data":"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7"} Mar 07 21:17:40.038979 master-0 kubenswrapper[7689]: I0307 21:17:40.036524 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="machine-approver-controller" containerID="cri-o://8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7" gracePeriod=30 Mar 07 21:17:40.053142 master-0 kubenswrapper[7689]: I0307 21:17:40.053096 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" 
event={"ID":"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021","Type":"ContainerStarted","Data":"eb2620acea82b03d358eb877d4ce8f987e4e4ce7b9750792f77001fdf2ce62af"} Mar 07 21:17:40.106172 master-0 kubenswrapper[7689]: I0307 21:17:40.104108 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" podStartSLOduration=4.506199979 podStartE2EDuration="28.104069928s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:15.610856178 +0000 UTC m=+169.163183070" lastFinishedPulling="2026-03-07 21:17:39.208726117 +0000 UTC m=+192.761053019" observedRunningTime="2026-03-07 21:17:40.081607383 +0000 UTC m=+193.633934275" watchObservedRunningTime="2026-03-07 21:17:40.104069928 +0000 UTC m=+193.656396820" Mar 07 21:17:40.124373 master-0 kubenswrapper[7689]: I0307 21:17:40.112874 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" event={"ID":"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78","Type":"ContainerStarted","Data":"466b9a003b5f18babe67ec2250b78f2bb30c6822652916081396ed6ec2ac51ed"} Mar 07 21:17:40.124373 master-0 kubenswrapper[7689]: I0307 21:17:40.118766 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" event={"ID":"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d","Type":"ContainerStarted","Data":"5ba8d02efd97ab96c66d2e5a8c58f04777b536ec1ff43d8a222b2f0642623996"} Mar 07 21:17:40.124858 master-0 kubenswrapper[7689]: I0307 21:17:40.124778 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:40.146897 master-0 kubenswrapper[7689]: I0307 21:17:40.130017 7689 patch_prober.go:28] interesting pod/packageserver-f5bf97fcc-w82vx container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure 
output="Get \"https://10.128.0.67:5443/healthz\": dial tcp 10.128.0.67:5443: connect: connection refused" start-of-body= Mar 07 21:17:40.146897 master-0 kubenswrapper[7689]: I0307 21:17:40.130078 7689 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" podUID="c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d" containerName="packageserver" probeResult="failure" output="Get \"https://10.128.0.67:5443/healthz\": dial tcp 10.128.0.67:5443: connect: connection refused" Mar 07 21:17:40.146897 master-0 kubenswrapper[7689]: I0307 21:17:40.132277 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" event={"ID":"b12701eb-4226-4f9c-9398-ad0c3fea7451","Type":"ContainerStarted","Data":"d1fc671510809b5ce34fe6d8c109ba8c0532578d622b3287779a089fe73faa48"} Mar 07 21:17:40.178532 master-0 kubenswrapper[7689]: I0307 21:17:40.176763 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" podStartSLOduration=15.95369749 podStartE2EDuration="28.176736169s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:15.813617611 +0000 UTC m=+169.365944503" lastFinishedPulling="2026-03-07 21:17:28.03665625 +0000 UTC m=+181.588983182" observedRunningTime="2026-03-07 21:17:40.159769788 +0000 UTC m=+193.712096680" watchObservedRunningTime="2026-03-07 21:17:40.176736169 +0000 UTC m=+193.729063061" Mar 07 21:17:40.178532 master-0 kubenswrapper[7689]: I0307 21:17:40.177899 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" podStartSLOduration=15.979620728 podStartE2EDuration="28.177892007s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:15.838495454 +0000 UTC m=+169.390822346" 
lastFinishedPulling="2026-03-07 21:17:28.036766723 +0000 UTC m=+181.589093625" observedRunningTime="2026-03-07 21:17:40.13637135 +0000 UTC m=+193.688698252" watchObservedRunningTime="2026-03-07 21:17:40.177892007 +0000 UTC m=+193.730218899" Mar 07 21:17:40.224157 master-0 kubenswrapper[7689]: I0307 21:17:40.223377 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" podStartSLOduration=15.60583959 podStartE2EDuration="28.223165054s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:15.422057682 +0000 UTC m=+168.974384574" lastFinishedPulling="2026-03-07 21:17:28.039383136 +0000 UTC m=+181.591710038" observedRunningTime="2026-03-07 21:17:40.222850157 +0000 UTC m=+193.775177059" watchObservedRunningTime="2026-03-07 21:17:40.223165054 +0000 UTC m=+193.775491946" Mar 07 21:17:40.245754 master-0 kubenswrapper[7689]: I0307 21:17:40.244017 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-955fcfb87-cwdkv_53e19dea-e8cb-478d-90da-3820712d6ac9/machine-approver-controller/0.log" Mar 07 21:17:40.245754 master-0 kubenswrapper[7689]: I0307 21:17:40.245268 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" Mar 07 21:17:40.296308 master-0 kubenswrapper[7689]: I0307 21:17:40.293731 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" podStartSLOduration=5.144722863 podStartE2EDuration="28.293706063s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:16.275573308 +0000 UTC m=+169.827900210" lastFinishedPulling="2026-03-07 21:17:39.424556488 +0000 UTC m=+192.976883410" observedRunningTime="2026-03-07 21:17:40.262135089 +0000 UTC m=+193.814461981" watchObservedRunningTime="2026-03-07 21:17:40.293706063 +0000 UTC m=+193.846032955" Mar 07 21:17:40.296308 master-0 kubenswrapper[7689]: I0307 21:17:40.294583 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" podStartSLOduration=16.294577775 podStartE2EDuration="16.294577775s" podCreationTimestamp="2026-03-07 21:17:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:40.293085959 +0000 UTC m=+193.845412861" watchObservedRunningTime="2026-03-07 21:17:40.294577775 +0000 UTC m=+193.846904667" Mar 07 21:17:40.329716 master-0 kubenswrapper[7689]: I0307 21:17:40.319584 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" podStartSLOduration=15.748542268 podStartE2EDuration="28.31954588s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:15.465569626 +0000 UTC m=+169.017896518" lastFinishedPulling="2026-03-07 21:17:28.036573198 +0000 UTC m=+181.588900130" observedRunningTime="2026-03-07 21:17:40.315325248 +0000 UTC m=+193.867652140" watchObservedRunningTime="2026-03-07 
21:17:40.31954588 +0000 UTC m=+193.871872772" Mar 07 21:17:40.435711 master-0 kubenswrapper[7689]: I0307 21:17:40.432911 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-config\") pod \"53e19dea-e8cb-478d-90da-3820712d6ac9\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " Mar 07 21:17:40.435711 master-0 kubenswrapper[7689]: I0307 21:17:40.433003 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzg72\" (UniqueName: \"kubernetes.io/projected/53e19dea-e8cb-478d-90da-3820712d6ac9-kube-api-access-jzg72\") pod \"53e19dea-e8cb-478d-90da-3820712d6ac9\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " Mar 07 21:17:40.435711 master-0 kubenswrapper[7689]: I0307 21:17:40.433129 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/53e19dea-e8cb-478d-90da-3820712d6ac9-machine-approver-tls\") pod \"53e19dea-e8cb-478d-90da-3820712d6ac9\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " Mar 07 21:17:40.435711 master-0 kubenswrapper[7689]: I0307 21:17:40.433205 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-auth-proxy-config\") pod \"53e19dea-e8cb-478d-90da-3820712d6ac9\" (UID: \"53e19dea-e8cb-478d-90da-3820712d6ac9\") " Mar 07 21:17:40.435711 master-0 kubenswrapper[7689]: I0307 21:17:40.433460 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-config" (OuterVolumeSpecName: "config") pod "53e19dea-e8cb-478d-90da-3820712d6ac9" (UID: "53e19dea-e8cb-478d-90da-3820712d6ac9"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:17:40.435711 master-0 kubenswrapper[7689]: I0307 21:17:40.434965 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "53e19dea-e8cb-478d-90da-3820712d6ac9" (UID: "53e19dea-e8cb-478d-90da-3820712d6ac9"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:17:40.440712 master-0 kubenswrapper[7689]: I0307 21:17:40.436935 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/53e19dea-e8cb-478d-90da-3820712d6ac9-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "53e19dea-e8cb-478d-90da-3820712d6ac9" (UID: "53e19dea-e8cb-478d-90da-3820712d6ac9"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:17:40.440712 master-0 kubenswrapper[7689]: I0307 21:17:40.437130 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/53e19dea-e8cb-478d-90da-3820712d6ac9-kube-api-access-jzg72" (OuterVolumeSpecName: "kube-api-access-jzg72") pod "53e19dea-e8cb-478d-90da-3820712d6ac9" (UID: "53e19dea-e8cb-478d-90da-3820712d6ac9"). InnerVolumeSpecName "kube-api-access-jzg72". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:17:40.536401 master-0 kubenswrapper[7689]: I0307 21:17:40.536267 7689 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:40.536401 master-0 kubenswrapper[7689]: I0307 21:17:40.536318 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/53e19dea-e8cb-478d-90da-3820712d6ac9-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:40.536401 master-0 kubenswrapper[7689]: I0307 21:17:40.536331 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzg72\" (UniqueName: \"kubernetes.io/projected/53e19dea-e8cb-478d-90da-3820712d6ac9-kube-api-access-jzg72\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:40.536401 master-0 kubenswrapper[7689]: I0307 21:17:40.536340 7689 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/53e19dea-e8cb-478d-90da-3820712d6ac9-machine-approver-tls\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:41.142591 master-0 kubenswrapper[7689]: I0307 21:17:41.142498 7689 generic.go:334] "Generic (PLEG): container finished" podID="f08edf29-c53f-452d-880b-e8ce27b05b6f" containerID="f6257eda77e6ac921b623c45ec6e6f8e7833cf0c08e715e3b224823a05866040" exitCode=0 Mar 07 21:17:41.143118 master-0 kubenswrapper[7689]: I0307 21:17:41.142598 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxpb5" event={"ID":"f08edf29-c53f-452d-880b-e8ce27b05b6f","Type":"ContainerDied","Data":"f6257eda77e6ac921b623c45ec6e6f8e7833cf0c08e715e3b224823a05866040"} Mar 07 21:17:41.146358 master-0 kubenswrapper[7689]: I0307 21:17:41.146267 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" event={"ID":"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021","Type":"ContainerStarted","Data":"d511f782e68b68ab8f12beb4025a22972830b43430f1d3520ea5a557d0f760a6"} Mar 07 21:17:41.149041 master-0 kubenswrapper[7689]: I0307 21:17:41.149004 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerStarted","Data":"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd"} Mar 07 21:17:41.149041 master-0 kubenswrapper[7689]: I0307 21:17:41.149039 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerStarted","Data":"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a"} Mar 07 21:17:41.149137 master-0 kubenswrapper[7689]: I0307 21:17:41.149054 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerStarted","Data":"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0"} Mar 07 21:17:41.149137 master-0 kubenswrapper[7689]: I0307 21:17:41.149088 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="cluster-cloud-controller-manager" containerID="cri-o://641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0" gracePeriod=30 Mar 07 21:17:41.149205 master-0 kubenswrapper[7689]: I0307 21:17:41.149124 7689 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="kube-rbac-proxy" containerID="cri-o://099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd" gracePeriod=30 Mar 07 21:17:41.149285 master-0 kubenswrapper[7689]: I0307 21:17:41.149104 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="config-sync-controllers" containerID="cri-o://acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a" gracePeriod=30 Mar 07 21:17:41.152185 master-0 kubenswrapper[7689]: I0307 21:17:41.152127 7689 generic.go:334] "Generic (PLEG): container finished" podID="7f65054f-caf3-4cd3-889e-8d5a5376b1b8" containerID="001b65d1db011b8bb96e16a7cb4e30e0932ee8b9e303affc470347e6b0c4af77" exitCode=0 Mar 07 21:17:41.152287 master-0 kubenswrapper[7689]: I0307 21:17:41.152220 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2cc9" event={"ID":"7f65054f-caf3-4cd3-889e-8d5a5376b1b8","Type":"ContainerDied","Data":"001b65d1db011b8bb96e16a7cb4e30e0932ee8b9e303affc470347e6b0c4af77"} Mar 07 21:17:41.158661 master-0 kubenswrapper[7689]: I0307 21:17:41.158581 7689 generic.go:334] "Generic (PLEG): container finished" podID="c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9" containerID="87c8917d09451c4b2cd526c22f55e1e1633a895f47f1081055707fe4874946ed" exitCode=0 Mar 07 21:17:41.158775 master-0 kubenswrapper[7689]: I0307 21:17:41.158720 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw59s" event={"ID":"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9","Type":"ContainerDied","Data":"87c8917d09451c4b2cd526c22f55e1e1633a895f47f1081055707fe4874946ed"} Mar 07 21:17:41.161457 master-0 
kubenswrapper[7689]: I0307 21:17:41.161342 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-955fcfb87-cwdkv_53e19dea-e8cb-478d-90da-3820712d6ac9/machine-approver-controller/0.log" Mar 07 21:17:41.162637 master-0 kubenswrapper[7689]: I0307 21:17:41.162582 7689 generic.go:334] "Generic (PLEG): container finished" podID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerID="8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7" exitCode=2 Mar 07 21:17:41.162701 master-0 kubenswrapper[7689]: I0307 21:17:41.162635 7689 generic.go:334] "Generic (PLEG): container finished" podID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerID="2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3" exitCode=0 Mar 07 21:17:41.162805 master-0 kubenswrapper[7689]: I0307 21:17:41.162762 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" event={"ID":"53e19dea-e8cb-478d-90da-3820712d6ac9","Type":"ContainerDied","Data":"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7"} Mar 07 21:17:41.162847 master-0 kubenswrapper[7689]: I0307 21:17:41.162824 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" event={"ID":"53e19dea-e8cb-478d-90da-3820712d6ac9","Type":"ContainerDied","Data":"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3"} Mar 07 21:17:41.162937 master-0 kubenswrapper[7689]: I0307 21:17:41.162848 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" event={"ID":"53e19dea-e8cb-478d-90da-3820712d6ac9","Type":"ContainerDied","Data":"ccf912ee4b17090925c8bd3ce2fd46e80c3fa5968c7c32c4d89480c998e7afa2"} Mar 07 21:17:41.163029 master-0 kubenswrapper[7689]: I0307 21:17:41.162952 7689 scope.go:117] "RemoveContainer" 
containerID="8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7" Mar 07 21:17:41.163394 master-0 kubenswrapper[7689]: I0307 21:17:41.163323 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv" Mar 07 21:17:41.182370 master-0 kubenswrapper[7689]: I0307 21:17:41.182295 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" event={"ID":"655b9f0a-cf27-443d-b0ea-3642dcae1ad2","Type":"ContainerStarted","Data":"3466d10dd8ef236b1420a581e4ee68813efae23aabda31552a74f784024c4658"} Mar 07 21:17:41.207216 master-0 kubenswrapper[7689]: I0307 21:17:41.207150 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" event={"ID":"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d","Type":"ContainerStarted","Data":"03c8ae40f0340a840ace0bf0014c4459e75da62fa2319061ad07695b04090a03"} Mar 07 21:17:41.212726 master-0 kubenswrapper[7689]: I0307 21:17:41.212423 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" podStartSLOduration=12.259167086 podStartE2EDuration="29.21239119s" podCreationTimestamp="2026-03-07 21:17:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:14.914382187 +0000 UTC m=+168.466709069" lastFinishedPulling="2026-03-07 21:17:31.867606251 +0000 UTC m=+185.419933173" observedRunningTime="2026-03-07 21:17:41.196883873 +0000 UTC m=+194.749210785" watchObservedRunningTime="2026-03-07 21:17:41.21239119 +0000 UTC m=+194.764718092" Mar 07 21:17:41.218515 master-0 kubenswrapper[7689]: I0307 21:17:41.214879 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:41.218515 master-0 kubenswrapper[7689]: I0307 
21:17:41.216138 7689 generic.go:334] "Generic (PLEG): container finished" podID="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" containerID="2aa15de1b2bb19d02b5747770e2bfa186549430d792f08f2ab12bc57e19314a5" exitCode=0 Mar 07 21:17:41.218515 master-0 kubenswrapper[7689]: I0307 21:17:41.216219 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdltd" event={"ID":"5625eb9f-c80b-47b1-b70c-aa636fbc03ac","Type":"ContainerDied","Data":"2aa15de1b2bb19d02b5747770e2bfa186549430d792f08f2ab12bc57e19314a5"} Mar 07 21:17:41.265660 master-0 kubenswrapper[7689]: I0307 21:17:41.265615 7689 scope.go:117] "RemoveContainer" containerID="2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3" Mar 07 21:17:41.304117 master-0 kubenswrapper[7689]: I0307 21:17:41.303252 7689 scope.go:117] "RemoveContainer" containerID="8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7" Mar 07 21:17:41.308974 master-0 kubenswrapper[7689]: E0307 21:17:41.308398 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7\": container with ID starting with 8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7 not found: ID does not exist" containerID="8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7" Mar 07 21:17:41.308974 master-0 kubenswrapper[7689]: I0307 21:17:41.308459 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7"} err="failed to get container status \"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7\": rpc error: code = NotFound desc = could not find container \"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7\": container with ID starting with 8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7 not found: ID does not 
exist" Mar 07 21:17:41.308974 master-0 kubenswrapper[7689]: I0307 21:17:41.308487 7689 scope.go:117] "RemoveContainer" containerID="2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3" Mar 07 21:17:41.308974 master-0 kubenswrapper[7689]: E0307 21:17:41.308961 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3\": container with ID starting with 2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3 not found: ID does not exist" containerID="2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3" Mar 07 21:17:41.309137 master-0 kubenswrapper[7689]: I0307 21:17:41.308998 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3"} err="failed to get container status \"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3\": rpc error: code = NotFound desc = could not find container \"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3\": container with ID starting with 2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3 not found: ID does not exist" Mar 07 21:17:41.309137 master-0 kubenswrapper[7689]: I0307 21:17:41.309016 7689 scope.go:117] "RemoveContainer" containerID="8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7" Mar 07 21:17:41.309381 master-0 kubenswrapper[7689]: I0307 21:17:41.309209 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7"} err="failed to get container status \"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7\": rpc error: code = NotFound desc = could not find container \"8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7\": container with ID starting with 
8f84599e26bc6cf6200c8503bcfb5b717ce61dbb3bc098b86f8dbb84f389eca7 not found: ID does not exist" Mar 07 21:17:41.309381 master-0 kubenswrapper[7689]: I0307 21:17:41.309247 7689 scope.go:117] "RemoveContainer" containerID="2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3" Mar 07 21:17:41.311160 master-0 kubenswrapper[7689]: I0307 21:17:41.309492 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3"} err="failed to get container status \"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3\": rpc error: code = NotFound desc = could not find container \"2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3\": container with ID starting with 2ba28e68526814e86dfda4705315487834c064c4ecb7c07ff6344731d17929f3 not found: ID does not exist" Mar 07 21:17:41.311382 master-0 kubenswrapper[7689]: I0307 21:17:41.311262 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" podStartSLOduration=22.311227895 podStartE2EDuration="22.311227895s" podCreationTimestamp="2026-03-07 21:17:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:41.309394461 +0000 UTC m=+194.861721363" watchObservedRunningTime="2026-03-07 21:17:41.311227895 +0000 UTC m=+194.863554787" Mar 07 21:17:41.355733 master-0 kubenswrapper[7689]: I0307 21:17:41.355494 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"] Mar 07 21:17:41.358931 master-0 kubenswrapper[7689]: I0307 21:17:41.358870 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cluster-machine-approver/machine-approver-955fcfb87-cwdkv"] Mar 07 21:17:41.367483 master-0 kubenswrapper[7689]: I0307 21:17:41.367461 7689 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" Mar 07 21:17:41.404350 master-0 kubenswrapper[7689]: I0307 21:17:41.404177 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l"] Mar 07 21:17:41.404597 master-0 kubenswrapper[7689]: E0307 21:17:41.404561 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="config-sync-controllers" Mar 07 21:17:41.404597 master-0 kubenswrapper[7689]: I0307 21:17:41.404582 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="config-sync-controllers" Mar 07 21:17:41.404665 master-0 kubenswrapper[7689]: E0307 21:17:41.404603 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="kube-rbac-proxy" Mar 07 21:17:41.404665 master-0 kubenswrapper[7689]: I0307 21:17:41.404616 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="kube-rbac-proxy" Mar 07 21:17:41.404665 master-0 kubenswrapper[7689]: E0307 21:17:41.404638 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="cluster-cloud-controller-manager" Mar 07 21:17:41.404665 master-0 kubenswrapper[7689]: I0307 21:17:41.404653 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="cluster-cloud-controller-manager" Mar 07 21:17:41.404946 master-0 kubenswrapper[7689]: E0307 21:17:41.404673 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="machine-approver-controller" Mar 07 21:17:41.404946 master-0 kubenswrapper[7689]: I0307 21:17:41.404710 7689 
state_mem.go:107] "Deleted CPUSet assignment" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="machine-approver-controller" Mar 07 21:17:41.404946 master-0 kubenswrapper[7689]: E0307 21:17:41.404734 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="kube-rbac-proxy" Mar 07 21:17:41.404946 master-0 kubenswrapper[7689]: I0307 21:17:41.404748 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="kube-rbac-proxy" Mar 07 21:17:41.404946 master-0 kubenswrapper[7689]: I0307 21:17:41.404907 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="kube-rbac-proxy" Mar 07 21:17:41.404946 master-0 kubenswrapper[7689]: I0307 21:17:41.404928 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="machine-approver-controller" Mar 07 21:17:41.404946 master-0 kubenswrapper[7689]: I0307 21:17:41.404943 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" containerName="kube-rbac-proxy" Mar 07 21:17:41.405146 master-0 kubenswrapper[7689]: I0307 21:17:41.404968 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="cluster-cloud-controller-manager" Mar 07 21:17:41.405146 master-0 kubenswrapper[7689]: I0307 21:17:41.404990 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="df7c15cf-e017-478d-93bc-c7890876b383" containerName="config-sync-controllers" Mar 07 21:17:41.406002 master-0 kubenswrapper[7689]: I0307 21:17:41.405966 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.409183 master-0 kubenswrapper[7689]: I0307 21:17:41.409143 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 21:17:41.409285 master-0 kubenswrapper[7689]: I0307 21:17:41.409078 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 21:17:41.409365 master-0 kubenswrapper[7689]: I0307 21:17:41.409258 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 21:17:41.409402 master-0 kubenswrapper[7689]: I0307 21:17:41.409376 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fswfb" Mar 07 21:17:41.409433 master-0 kubenswrapper[7689]: I0307 21:17:41.409410 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 21:17:41.409433 master-0 kubenswrapper[7689]: I0307 21:17:41.409427 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 21:17:41.453616 master-0 kubenswrapper[7689]: I0307 21:17:41.453547 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7c15cf-e017-478d-93bc-c7890876b383-cloud-controller-manager-operator-tls\") pod \"df7c15cf-e017-478d-93bc-c7890876b383\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " Mar 07 21:17:41.453824 master-0 kubenswrapper[7689]: I0307 21:17:41.453735 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-auth-proxy-config\") pod \"df7c15cf-e017-478d-93bc-c7890876b383\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " Mar 07 21:17:41.453884 master-0 kubenswrapper[7689]: I0307 21:17:41.453856 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg2ch\" (UniqueName: \"kubernetes.io/projected/df7c15cf-e017-478d-93bc-c7890876b383-kube-api-access-qg2ch\") pod \"df7c15cf-e017-478d-93bc-c7890876b383\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " Mar 07 21:17:41.454111 master-0 kubenswrapper[7689]: I0307 21:17:41.454070 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-images\") pod \"df7c15cf-e017-478d-93bc-c7890876b383\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " Mar 07 21:17:41.454162 master-0 kubenswrapper[7689]: I0307 21:17:41.454128 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/df7c15cf-e017-478d-93bc-c7890876b383-host-etc-kube\") pod \"df7c15cf-e017-478d-93bc-c7890876b383\" (UID: \"df7c15cf-e017-478d-93bc-c7890876b383\") " Mar 07 21:17:41.454445 master-0 kubenswrapper[7689]: I0307 21:17:41.454421 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.454490 master-0 kubenswrapper[7689]: I0307 21:17:41.454468 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.454532 master-0 kubenswrapper[7689]: I0307 21:17:41.454498 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.454532 master-0 kubenswrapper[7689]: I0307 21:17:41.454524 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpck7\" (UniqueName: \"kubernetes.io/projected/e3fe386a-dea8-484a-b95a-0f3f475b1f82-kube-api-access-fpck7\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.455723 master-0 kubenswrapper[7689]: I0307 21:17:41.455664 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df7c15cf-e017-478d-93bc-c7890876b383-host-etc-kube" (OuterVolumeSpecName: "host-etc-kube") pod "df7c15cf-e017-478d-93bc-c7890876b383" (UID: "df7c15cf-e017-478d-93bc-c7890876b383"). InnerVolumeSpecName "host-etc-kube". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:41.455829 master-0 kubenswrapper[7689]: I0307 21:17:41.455789 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "df7c15cf-e017-478d-93bc-c7890876b383" (UID: "df7c15cf-e017-478d-93bc-c7890876b383"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:17:41.456195 master-0 kubenswrapper[7689]: I0307 21:17:41.456169 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-images" (OuterVolumeSpecName: "images") pod "df7c15cf-e017-478d-93bc-c7890876b383" (UID: "df7c15cf-e017-478d-93bc-c7890876b383"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:17:41.459311 master-0 kubenswrapper[7689]: I0307 21:17:41.459280 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df7c15cf-e017-478d-93bc-c7890876b383-cloud-controller-manager-operator-tls" (OuterVolumeSpecName: "cloud-controller-manager-operator-tls") pod "df7c15cf-e017-478d-93bc-c7890876b383" (UID: "df7c15cf-e017-478d-93bc-c7890876b383"). InnerVolumeSpecName "cloud-controller-manager-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:17:41.460406 master-0 kubenswrapper[7689]: I0307 21:17:41.460311 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df7c15cf-e017-478d-93bc-c7890876b383-kube-api-access-qg2ch" (OuterVolumeSpecName: "kube-api-access-qg2ch") pod "df7c15cf-e017-478d-93bc-c7890876b383" (UID: "df7c15cf-e017-478d-93bc-c7890876b383"). InnerVolumeSpecName "kube-api-access-qg2ch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:17:41.555424 master-0 kubenswrapper[7689]: I0307 21:17:41.555335 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.555424 master-0 kubenswrapper[7689]: I0307 21:17:41.555436 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.555822 master-0 kubenswrapper[7689]: I0307 21:17:41.555489 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpck7\" (UniqueName: \"kubernetes.io/projected/e3fe386a-dea8-484a-b95a-0f3f475b1f82-kube-api-access-fpck7\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.555822 master-0 kubenswrapper[7689]: I0307 21:17:41.555590 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.555822 master-0 kubenswrapper[7689]: I0307 21:17:41.555666 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg2ch\" (UniqueName: 
\"kubernetes.io/projected/df7c15cf-e017-478d-93bc-c7890876b383-kube-api-access-qg2ch\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:41.555822 master-0 kubenswrapper[7689]: I0307 21:17:41.555755 7689 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-images\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:41.555822 master-0 kubenswrapper[7689]: I0307 21:17:41.555775 7689 reconciler_common.go:293] "Volume detached for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/df7c15cf-e017-478d-93bc-c7890876b383-host-etc-kube\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:41.555822 master-0 kubenswrapper[7689]: I0307 21:17:41.555795 7689 reconciler_common.go:293] "Volume detached for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/df7c15cf-e017-478d-93bc-c7890876b383-cloud-controller-manager-operator-tls\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:41.555822 master-0 kubenswrapper[7689]: I0307 21:17:41.555816 7689 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/df7c15cf-e017-478d-93bc-c7890876b383-auth-proxy-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:41.556945 master-0 kubenswrapper[7689]: I0307 21:17:41.556853 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.560522 master-0 kubenswrapper[7689]: I0307 21:17:41.560453 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config\") pod 
\"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.563815 master-0 kubenswrapper[7689]: I0307 21:17:41.563762 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.577575 master-0 kubenswrapper[7689]: I0307 21:17:41.577505 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpck7\" (UniqueName: \"kubernetes.io/projected/e3fe386a-dea8-484a-b95a-0f3f475b1f82-kube-api-access-fpck7\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.731319 master-0 kubenswrapper[7689]: I0307 21:17:41.728096 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:41.792703 master-0 kubenswrapper[7689]: W0307 21:17:41.790025 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3fe386a_dea8_484a_b95a_0f3f475b1f82.slice/crio-21e44c55e16841087847adbebb0bb6c58ea019050056419446d9a85cc4d4d496 WatchSource:0}: Error finding container 21e44c55e16841087847adbebb0bb6c58ea019050056419446d9a85cc4d4d496: Status 404 returned error can't find the container with id 21e44c55e16841087847adbebb0bb6c58ea019050056419446d9a85cc4d4d496 Mar 07 21:17:42.237257 master-0 kubenswrapper[7689]: I0307 21:17:42.237161 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-fdltd" event={"ID":"5625eb9f-c80b-47b1-b70c-aa636fbc03ac","Type":"ContainerStarted","Data":"f4bf2595eda7a32c4b35dad40256a86d6f29ad36b9fcbc95dfd0a6260e36c00e"} Mar 07 21:17:42.256543 master-0 kubenswrapper[7689]: I0307 21:17:42.256084 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2cc9" event={"ID":"7f65054f-caf3-4cd3-889e-8d5a5376b1b8","Type":"ContainerStarted","Data":"673acca3db3e43394b0dc0449688320d0b77beba7c902cc7ddd3651a74066ba7"} Mar 07 21:17:42.272524 master-0 kubenswrapper[7689]: I0307 21:17:42.267121 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-fdltd" podStartSLOduration=13.389373823 podStartE2EDuration="38.267096543s" podCreationTimestamp="2026-03-07 21:17:04 +0000 UTC" firstStartedPulling="2026-03-07 21:17:16.728301311 +0000 UTC m=+170.280628213" lastFinishedPulling="2026-03-07 21:17:41.606023991 +0000 UTC m=+195.158350933" observedRunningTime="2026-03-07 21:17:42.260270068 +0000 UTC m=+195.812596980" watchObservedRunningTime="2026-03-07 21:17:42.267096543 +0000 UTC m=+195.819423435" Mar 07 21:17:42.272524 master-0 
kubenswrapper[7689]: I0307 21:17:42.268106 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" event={"ID":"e3fe386a-dea8-484a-b95a-0f3f475b1f82","Type":"ContainerStarted","Data":"a4c03afd7e6edc85290545e45a60844139dbda95c108ef03d27f1ec99b647207"} Mar 07 21:17:42.272524 master-0 kubenswrapper[7689]: I0307 21:17:42.268138 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" event={"ID":"e3fe386a-dea8-484a-b95a-0f3f475b1f82","Type":"ContainerStarted","Data":"21e44c55e16841087847adbebb0bb6c58ea019050056419446d9a85cc4d4d496"} Mar 07 21:17:42.272524 master-0 kubenswrapper[7689]: I0307 21:17:42.270901 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rw59s" event={"ID":"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9","Type":"ContainerStarted","Data":"ba4bc478efe541263e74d02c7f0bfbbf647b710b1c9e05a218205da40e773c81"} Mar 07 21:17:42.274735 master-0 kubenswrapper[7689]: I0307 21:17:42.274702 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-vxpb5" event={"ID":"f08edf29-c53f-452d-880b-e8ce27b05b6f","Type":"ContainerStarted","Data":"863c8731b52223c4e8ebe1a91fdc2c1813c3796e0cc837b0f13626da33053069"} Mar 07 21:17:42.277526 master-0 kubenswrapper[7689]: I0307 21:17:42.277486 7689 generic.go:334] "Generic (PLEG): container finished" podID="df7c15cf-e017-478d-93bc-c7890876b383" containerID="099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd" exitCode=0 Mar 07 21:17:42.277526 master-0 kubenswrapper[7689]: I0307 21:17:42.277520 7689 generic.go:334] "Generic (PLEG): container finished" podID="df7c15cf-e017-478d-93bc-c7890876b383" containerID="acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a" exitCode=0 Mar 07 21:17:42.277640 master-0 kubenswrapper[7689]: I0307 21:17:42.277530 7689 generic.go:334] "Generic 
(PLEG): container finished" podID="df7c15cf-e017-478d-93bc-c7890876b383" containerID="641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0" exitCode=0 Mar 07 21:17:42.277640 master-0 kubenswrapper[7689]: I0307 21:17:42.277578 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" Mar 07 21:17:42.277640 master-0 kubenswrapper[7689]: I0307 21:17:42.277629 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerDied","Data":"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd"} Mar 07 21:17:42.277828 master-0 kubenswrapper[7689]: I0307 21:17:42.277653 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerDied","Data":"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a"} Mar 07 21:17:42.277828 master-0 kubenswrapper[7689]: I0307 21:17:42.277664 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerDied","Data":"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0"} Mar 07 21:17:42.277828 master-0 kubenswrapper[7689]: I0307 21:17:42.277673 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d" event={"ID":"df7c15cf-e017-478d-93bc-c7890876b383","Type":"ContainerDied","Data":"e34e1ead14a1f46ae17df72bae6b5228c67e70a551aab1dde4319fa7bdff201f"} Mar 07 
21:17:42.277828 master-0 kubenswrapper[7689]: I0307 21:17:42.277708 7689 scope.go:117] "RemoveContainer" containerID="099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd" Mar 07 21:17:42.287769 master-0 kubenswrapper[7689]: I0307 21:17:42.287665 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z2cc9" podStartSLOduration=14.395756869 podStartE2EDuration="39.287645672s" podCreationTimestamp="2026-03-07 21:17:03 +0000 UTC" firstStartedPulling="2026-03-07 21:17:16.712415255 +0000 UTC m=+170.264742147" lastFinishedPulling="2026-03-07 21:17:41.604304028 +0000 UTC m=+195.156630950" observedRunningTime="2026-03-07 21:17:42.286183056 +0000 UTC m=+195.838509968" watchObservedRunningTime="2026-03-07 21:17:42.287645672 +0000 UTC m=+195.839972564" Mar 07 21:17:42.319188 master-0 kubenswrapper[7689]: I0307 21:17:42.319088 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-vxpb5" podStartSLOduration=15.390841493 podStartE2EDuration="41.319052963s" podCreationTimestamp="2026-03-07 21:17:01 +0000 UTC" firstStartedPulling="2026-03-07 21:17:15.674216723 +0000 UTC m=+169.226543615" lastFinishedPulling="2026-03-07 21:17:41.602428193 +0000 UTC m=+195.154755085" observedRunningTime="2026-03-07 21:17:42.31726282 +0000 UTC m=+195.869589722" watchObservedRunningTime="2026-03-07 21:17:42.319052963 +0000 UTC m=+195.871379855" Mar 07 21:17:42.325955 master-0 kubenswrapper[7689]: I0307 21:17:42.325912 7689 scope.go:117] "RemoveContainer" containerID="acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a" Mar 07 21:17:42.345129 master-0 kubenswrapper[7689]: I0307 21:17:42.345051 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rw59s" podStartSLOduration=16.495529976 podStartE2EDuration="41.345030542s" podCreationTimestamp="2026-03-07 21:17:01 +0000 UTC" 
firstStartedPulling="2026-03-07 21:17:16.739615595 +0000 UTC m=+170.291942497" lastFinishedPulling="2026-03-07 21:17:41.589116171 +0000 UTC m=+195.141443063" observedRunningTime="2026-03-07 21:17:42.342807499 +0000 UTC m=+195.895134401" watchObservedRunningTime="2026-03-07 21:17:42.345030542 +0000 UTC m=+195.897357444" Mar 07 21:17:42.348072 master-0 kubenswrapper[7689]: I0307 21:17:42.347989 7689 scope.go:117] "RemoveContainer" containerID="641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0" Mar 07 21:17:42.370988 master-0 kubenswrapper[7689]: I0307 21:17:42.368518 7689 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"] Mar 07 21:17:42.377758 master-0 kubenswrapper[7689]: I0307 21:17:42.377707 7689 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-559568b945-pmr9d"] Mar 07 21:17:42.384023 master-0 kubenswrapper[7689]: I0307 21:17:42.383966 7689 scope.go:117] "RemoveContainer" containerID="099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd" Mar 07 21:17:42.387992 master-0 kubenswrapper[7689]: E0307 21:17:42.387930 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd\": container with ID starting with 099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd not found: ID does not exist" containerID="099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd" Mar 07 21:17:42.388061 master-0 kubenswrapper[7689]: I0307 21:17:42.388004 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd"} err="failed to get container status 
\"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd\": rpc error: code = NotFound desc = could not find container \"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd\": container with ID starting with 099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd not found: ID does not exist" Mar 07 21:17:42.388061 master-0 kubenswrapper[7689]: I0307 21:17:42.388049 7689 scope.go:117] "RemoveContainer" containerID="acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a" Mar 07 21:17:42.388737 master-0 kubenswrapper[7689]: E0307 21:17:42.388637 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a\": container with ID starting with acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a not found: ID does not exist" containerID="acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a" Mar 07 21:17:42.388806 master-0 kubenswrapper[7689]: I0307 21:17:42.388727 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a"} err="failed to get container status \"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a\": rpc error: code = NotFound desc = could not find container \"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a\": container with ID starting with acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a not found: ID does not exist" Mar 07 21:17:42.388854 master-0 kubenswrapper[7689]: I0307 21:17:42.388812 7689 scope.go:117] "RemoveContainer" containerID="641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0" Mar 07 21:17:42.389335 master-0 kubenswrapper[7689]: E0307 21:17:42.389310 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0\": container with ID starting with 641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0 not found: ID does not exist" containerID="641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0" Mar 07 21:17:42.389392 master-0 kubenswrapper[7689]: I0307 21:17:42.389334 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0"} err="failed to get container status \"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0\": rpc error: code = NotFound desc = could not find container \"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0\": container with ID starting with 641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0 not found: ID does not exist" Mar 07 21:17:42.389392 master-0 kubenswrapper[7689]: I0307 21:17:42.389349 7689 scope.go:117] "RemoveContainer" containerID="099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd" Mar 07 21:17:42.389736 master-0 kubenswrapper[7689]: I0307 21:17:42.389655 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd"} err="failed to get container status \"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd\": rpc error: code = NotFound desc = could not find container \"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd\": container with ID starting with 099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd not found: ID does not exist" Mar 07 21:17:42.389736 master-0 kubenswrapper[7689]: I0307 21:17:42.389696 7689 scope.go:117] "RemoveContainer" containerID="acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a" Mar 07 21:17:42.391841 master-0 kubenswrapper[7689]: I0307 21:17:42.391785 7689 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a"} err="failed to get container status \"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a\": rpc error: code = NotFound desc = could not find container \"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a\": container with ID starting with acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a not found: ID does not exist" Mar 07 21:17:42.391841 master-0 kubenswrapper[7689]: I0307 21:17:42.391825 7689 scope.go:117] "RemoveContainer" containerID="641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0" Mar 07 21:17:42.393020 master-0 kubenswrapper[7689]: I0307 21:17:42.392951 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0"} err="failed to get container status \"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0\": rpc error: code = NotFound desc = could not find container \"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0\": container with ID starting with 641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0 not found: ID does not exist" Mar 07 21:17:42.393020 master-0 kubenswrapper[7689]: I0307 21:17:42.393016 7689 scope.go:117] "RemoveContainer" containerID="099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd" Mar 07 21:17:42.395952 master-0 kubenswrapper[7689]: I0307 21:17:42.395902 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd"} err="failed to get container status \"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd\": rpc error: code = NotFound desc = could not find container \"099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd\": container with ID starting with 
099fb87c2b33163532b12ba8325deb517c674053b332cc0cd3b083333c7e9fcd not found: ID does not exist" Mar 07 21:17:42.395952 master-0 kubenswrapper[7689]: I0307 21:17:42.395941 7689 scope.go:117] "RemoveContainer" containerID="acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a" Mar 07 21:17:42.396315 master-0 kubenswrapper[7689]: I0307 21:17:42.396281 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a"} err="failed to get container status \"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a\": rpc error: code = NotFound desc = could not find container \"acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a\": container with ID starting with acae2a2bb0459f99f7db0dbddc3570b5e8e1688202831d36ce6cf7c89a813a9a not found: ID does not exist" Mar 07 21:17:42.396315 master-0 kubenswrapper[7689]: I0307 21:17:42.396310 7689 scope.go:117] "RemoveContainer" containerID="641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0" Mar 07 21:17:42.396619 master-0 kubenswrapper[7689]: I0307 21:17:42.396575 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0"} err="failed to get container status \"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0\": rpc error: code = NotFound desc = could not find container \"641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0\": container with ID starting with 641cf71185bc96d5e5d39dcb7357812e0ed5a5abb213f466c8334bea05de63c0 not found: ID does not exist" Mar 07 21:17:42.415252 master-0 kubenswrapper[7689]: I0307 21:17:42.415180 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j"] Mar 07 21:17:42.416658 master-0 kubenswrapper[7689]: I0307 21:17:42.416622 
7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.421800 master-0 kubenswrapper[7689]: I0307 21:17:42.421644 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 07 21:17:42.421911 master-0 kubenswrapper[7689]: I0307 21:17:42.421870 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 21:17:42.422179 master-0 kubenswrapper[7689]: I0307 21:17:42.422143 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 07 21:17:42.422269 master-0 kubenswrapper[7689]: I0307 21:17:42.422237 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 07 21:17:42.422366 master-0 kubenswrapper[7689]: I0307 21:17:42.422315 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 07 21:17:42.422460 master-0 kubenswrapper[7689]: I0307 21:17:42.422438 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lvvbn" Mar 07 21:17:42.570629 master-0 kubenswrapper[7689]: I0307 21:17:42.570477 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca25117a-ccd5-4628-8342-e277bb7be0e2-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " 
pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.570629 master-0 kubenswrapper[7689]: I0307 21:17:42.570605 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgkz\" (UniqueName: \"kubernetes.io/projected/ca25117a-ccd5-4628-8342-e277bb7be0e2-kube-api-access-9kgkz\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.570872 master-0 kubenswrapper[7689]: I0307 21:17:42.570656 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.570872 master-0 kubenswrapper[7689]: I0307 21:17:42.570842 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.570941 master-0 kubenswrapper[7689]: I0307 21:17:42.570882 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images\") pod 
\"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.672226 master-0 kubenswrapper[7689]: I0307 21:17:42.672138 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.672226 master-0 kubenswrapper[7689]: I0307 21:17:42.672209 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.672555 master-0 kubenswrapper[7689]: I0307 21:17:42.672269 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca25117a-ccd5-4628-8342-e277bb7be0e2-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.672555 master-0 kubenswrapper[7689]: I0307 21:17:42.672291 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgkz\" (UniqueName: 
\"kubernetes.io/projected/ca25117a-ccd5-4628-8342-e277bb7be0e2-kube-api-access-9kgkz\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.672555 master-0 kubenswrapper[7689]: I0307 21:17:42.672319 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.672555 master-0 kubenswrapper[7689]: I0307 21:17:42.672503 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca25117a-ccd5-4628-8342-e277bb7be0e2-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.673492 master-0 kubenswrapper[7689]: I0307 21:17:42.673434 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.673713 master-0 kubenswrapper[7689]: I0307 21:17:42.673635 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.675722 master-0 kubenswrapper[7689]: I0307 21:17:42.675646 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.689452 master-0 kubenswrapper[7689]: I0307 21:17:42.689369 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgkz\" (UniqueName: \"kubernetes.io/projected/ca25117a-ccd5-4628-8342-e277bb7be0e2-kube-api-access-9kgkz\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.692938 master-0 kubenswrapper[7689]: I0307 21:17:42.692886 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="53e19dea-e8cb-478d-90da-3820712d6ac9" path="/var/lib/kubelet/pods/53e19dea-e8cb-478d-90da-3820712d6ac9/volumes" Mar 07 21:17:42.693527 master-0 kubenswrapper[7689]: I0307 21:17:42.693496 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df7c15cf-e017-478d-93bc-c7890876b383" path="/var/lib/kubelet/pods/df7c15cf-e017-478d-93bc-c7890876b383/volumes" Mar 07 21:17:42.739562 master-0 kubenswrapper[7689]: I0307 21:17:42.739493 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:42.763222 master-0 kubenswrapper[7689]: W0307 21:17:42.763160 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca25117a_ccd5_4628_8342_e277bb7be0e2.slice/crio-335b9f1124f039a1fd483115d3e476453c7ffa85e4ff68ee05e93130ee63f663 WatchSource:0}: Error finding container 335b9f1124f039a1fd483115d3e476453c7ffa85e4ff68ee05e93130ee63f663: Status 404 returned error can't find the container with id 335b9f1124f039a1fd483115d3e476453c7ffa85e4ff68ee05e93130ee63f663 Mar 07 21:17:43.286377 master-0 kubenswrapper[7689]: I0307 21:17:43.286318 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" event={"ID":"e3fe386a-dea8-484a-b95a-0f3f475b1f82","Type":"ContainerStarted","Data":"e1ff3eae48c3ff1b893f6264aeabf20e527e5a2aada9c5ff0d41a2697e563623"} Mar 07 21:17:43.287924 master-0 kubenswrapper[7689]: I0307 21:17:43.287888 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerStarted","Data":"7b4597e52188f5c573f68426c4f78eaba88e1097110bf160f774be6cf11820b0"} Mar 07 21:17:43.287924 master-0 kubenswrapper[7689]: I0307 21:17:43.287918 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerStarted","Data":"335b9f1124f039a1fd483115d3e476453c7ffa85e4ff68ee05e93130ee63f663"} Mar 07 21:17:43.315976 master-0 kubenswrapper[7689]: I0307 21:17:43.315896 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" podStartSLOduration=2.315871513 podStartE2EDuration="2.315871513s" podCreationTimestamp="2026-03-07 21:17:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:43.314040679 +0000 UTC m=+196.866367571" watchObservedRunningTime="2026-03-07 21:17:43.315871513 +0000 UTC m=+196.868198415" Mar 07 21:17:44.298950 master-0 kubenswrapper[7689]: I0307 21:17:44.298885 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerStarted","Data":"6f6092d3dc0e43917716030f9f9cb2a93ee00bd142f787bac59b9720fa04a34f"} Mar 07 21:17:44.299587 master-0 kubenswrapper[7689]: I0307 21:17:44.298963 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerStarted","Data":"dd4a42b20d0889c922e1f9c5f727fca2b250feadbb1f3cfd4fdc17a9825b9a9e"} Mar 07 21:17:44.403719 master-0 kubenswrapper[7689]: I0307 21:17:44.403596 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" podStartSLOduration=2.403568756 podStartE2EDuration="2.403568756s" podCreationTimestamp="2026-03-07 21:17:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:44.32453727 +0000 UTC m=+197.876864202" watchObservedRunningTime="2026-03-07 21:17:44.403568756 +0000 UTC m=+197.955895688" Mar 07 21:17:44.404784 master-0 kubenswrapper[7689]: I0307 21:17:44.404740 7689 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v"] Mar 07 21:17:44.406176 master-0 kubenswrapper[7689]: I0307 21:17:44.406137 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.410137 master-0 kubenswrapper[7689]: I0307 21:17:44.410054 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-z5sb9" Mar 07 21:17:44.410435 master-0 kubenswrapper[7689]: I0307 21:17:44.410245 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 21:17:44.419666 master-0 kubenswrapper[7689]: I0307 21:17:44.419588 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v"] Mar 07 21:17:44.507299 master-0 kubenswrapper[7689]: I0307 21:17:44.502317 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.507299 master-0 kubenswrapper[7689]: I0307 21:17:44.502433 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.507299 master-0 kubenswrapper[7689]: I0307 21:17:44.502525 7689 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2gv7\" (UniqueName: \"kubernetes.io/projected/5446df8b-23d4-4bf3-84ac-d8e1d18813af-kube-api-access-k2gv7\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.603880 master-0 kubenswrapper[7689]: I0307 21:17:44.603793 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.604202 master-0 kubenswrapper[7689]: I0307 21:17:44.604130 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.604310 master-0 kubenswrapper[7689]: I0307 21:17:44.604269 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2gv7\" (UniqueName: \"kubernetes.io/projected/5446df8b-23d4-4bf3-84ac-d8e1d18813af-kube-api-access-k2gv7\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.608939 master-0 kubenswrapper[7689]: I0307 21:17:44.605175 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.610075 master-0 kubenswrapper[7689]: I0307 21:17:44.610015 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.638170 master-0 kubenswrapper[7689]: I0307 21:17:44.638095 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2gv7\" (UniqueName: \"kubernetes.io/projected/5446df8b-23d4-4bf3-84ac-d8e1d18813af-kube-api-access-k2gv7\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.731417 master-0 kubenswrapper[7689]: I0307 21:17:44.731306 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:44.949497 master-0 kubenswrapper[7689]: I0307 21:17:44.949403 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:44.949497 master-0 kubenswrapper[7689]: I0307 21:17:44.949493 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:44.982902 master-0 kubenswrapper[7689]: I0307 21:17:44.982829 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:44.984254 master-0 kubenswrapper[7689]: I0307 21:17:44.984212 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:44.996169 master-0 kubenswrapper[7689]: I0307 21:17:44.996123 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:45.050778 master-0 kubenswrapper[7689]: I0307 21:17:45.050712 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:45.083814 master-0 kubenswrapper[7689]: I0307 21:17:45.083737 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:45.083814 master-0 kubenswrapper[7689]: I0307 21:17:45.083807 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:45.113384 master-0 kubenswrapper[7689]: I0307 21:17:45.113306 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:45.113384 master-0 kubenswrapper[7689]: I0307 21:17:45.113355 
7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:45.125654 master-0 kubenswrapper[7689]: I0307 21:17:45.125576 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:45.374517 master-0 kubenswrapper[7689]: I0307 21:17:45.371563 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v"] Mar 07 21:17:45.381999 master-0 kubenswrapper[7689]: W0307 21:17:45.381172 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5446df8b_23d4_4bf3_84ac_d8e1d18813af.slice/crio-838330722d2b77da035705dec1c88bcde51f4cbc19b4ce09cd15f9636d1831b9 WatchSource:0}: Error finding container 838330722d2b77da035705dec1c88bcde51f4cbc19b4ce09cd15f9636d1831b9: Status 404 returned error can't find the container with id 838330722d2b77da035705dec1c88bcde51f4cbc19b4ce09cd15f9636d1831b9 Mar 07 21:17:45.702458 master-0 kubenswrapper[7689]: I0307 21:17:45.702386 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml"] Mar 07 21:17:45.703615 master-0 kubenswrapper[7689]: I0307 21:17:45.703588 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:45.705550 master-0 kubenswrapper[7689]: I0307 21:17:45.705498 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr"] Mar 07 21:17:45.706617 master-0 kubenswrapper[7689]: I0307 21:17:45.706575 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" Mar 07 21:17:45.707961 master-0 kubenswrapper[7689]: W0307 21:17:45.707924 7689 reflector.go:561] object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls": failed to list *v1.Secret: secrets "prometheus-operator-admission-webhook-tls" is forbidden: User "system:node:master-0" cannot list resource "secrets" in API group "" in the namespace "openshift-monitoring": no relationship found between node 'master-0' and this object Mar 07 21:17:45.708030 master-0 kubenswrapper[7689]: E0307 21:17:45.707979 7689 reflector.go:158] "Unhandled Error" err="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"prometheus-operator-admission-webhook-tls\" is forbidden: User \"system:node:master-0\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-monitoring\": no relationship found between node 'master-0' and this object" logger="UnhandledError" Mar 07 21:17:45.710973 master-0 kubenswrapper[7689]: I0307 21:17:45.710919 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-79f8cd6fdd-858hg"] Mar 07 21:17:45.711937 master-0 kubenswrapper[7689]: I0307 21:17:45.711903 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.713763 master-0 kubenswrapper[7689]: I0307 21:17:45.713706 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 07 21:17:45.713763 master-0 kubenswrapper[7689]: I0307 21:17:45.713741 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 07 21:17:45.714287 master-0 kubenswrapper[7689]: I0307 21:17:45.714250 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 07 21:17:45.727973 master-0 kubenswrapper[7689]: I0307 21:17:45.727911 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 07 21:17:45.728206 master-0 kubenswrapper[7689]: I0307 21:17:45.728009 7689 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 07 21:17:45.728206 master-0 kubenswrapper[7689]: I0307 21:17:45.728044 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 21:17:45.729486 master-0 kubenswrapper[7689]: I0307 21:17:45.729412 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr"] Mar 07 21:17:45.735728 master-0 kubenswrapper[7689]: I0307 21:17:45.735546 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml"] Mar 07 21:17:45.822625 master-0 kubenswrapper[7689]: I0307 21:17:45.821602 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-stats-auth\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " 
pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.822625 master-0 kubenswrapper[7689]: I0307 21:17:45.821651 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d50f92ea-1c78-4535-a14c-96b00f2cf377-service-ca-bundle\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.822625 master-0 kubenswrapper[7689]: I0307 21:17:45.821704 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-metrics-certs\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.822625 master-0 kubenswrapper[7689]: I0307 21:17:45.821735 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wps6\" (UniqueName: \"kubernetes.io/projected/6d5765e6-80cc-404b-b375-c109febd1843-kube-api-access-8wps6\") pod \"network-check-source-7c67b67d47-88mpr\" (UID: \"6d5765e6-80cc-404b-b375-c109febd1843\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" Mar 07 21:17:45.822625 master-0 kubenswrapper[7689]: I0307 21:17:45.821759 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-default-certificate\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.822625 master-0 kubenswrapper[7689]: I0307 21:17:45.821941 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-jpjms\" (UniqueName: \"kubernetes.io/projected/d50f92ea-1c78-4535-a14c-96b00f2cf377-kube-api-access-jpjms\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.822625 master-0 kubenswrapper[7689]: I0307 21:17:45.822283 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-lxzml\" (UID: \"9515e34b-addf-487a-adf8-c6ef24fcc54c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:45.923982 master-0 kubenswrapper[7689]: I0307 21:17:45.923783 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-lxzml\" (UID: \"9515e34b-addf-487a-adf8-c6ef24fcc54c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:45.923982 master-0 kubenswrapper[7689]: I0307 21:17:45.923936 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d50f92ea-1c78-4535-a14c-96b00f2cf377-service-ca-bundle\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.923982 master-0 kubenswrapper[7689]: I0307 21:17:45.923980 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-stats-auth\") pod \"router-default-79f8cd6fdd-858hg\" (UID: 
\"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.924308 master-0 kubenswrapper[7689]: I0307 21:17:45.924064 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-metrics-certs\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.924308 master-0 kubenswrapper[7689]: I0307 21:17:45.924147 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wps6\" (UniqueName: \"kubernetes.io/projected/6d5765e6-80cc-404b-b375-c109febd1843-kube-api-access-8wps6\") pod \"network-check-source-7c67b67d47-88mpr\" (UID: \"6d5765e6-80cc-404b-b375-c109febd1843\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" Mar 07 21:17:45.924663 master-0 kubenswrapper[7689]: I0307 21:17:45.924585 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-default-certificate\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.924866 master-0 kubenswrapper[7689]: I0307 21:17:45.924822 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjms\" (UniqueName: \"kubernetes.io/projected/d50f92ea-1c78-4535-a14c-96b00f2cf377-kube-api-access-jpjms\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.925781 master-0 kubenswrapper[7689]: I0307 21:17:45.925722 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/d50f92ea-1c78-4535-a14c-96b00f2cf377-service-ca-bundle\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.931318 master-0 kubenswrapper[7689]: I0307 21:17:45.931265 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-metrics-certs\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.931863 master-0 kubenswrapper[7689]: I0307 21:17:45.931813 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-stats-auth\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.932832 master-0 kubenswrapper[7689]: I0307 21:17:45.932760 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-default-certificate\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:45.945870 master-0 kubenswrapper[7689]: I0307 21:17:45.945817 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wps6\" (UniqueName: \"kubernetes.io/projected/6d5765e6-80cc-404b-b375-c109febd1843-kube-api-access-8wps6\") pod \"network-check-source-7c67b67d47-88mpr\" (UID: \"6d5765e6-80cc-404b-b375-c109febd1843\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" Mar 07 21:17:45.955385 master-0 kubenswrapper[7689]: I0307 21:17:45.955331 7689 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-jpjms\" (UniqueName: \"kubernetes.io/projected/d50f92ea-1c78-4535-a14c-96b00f2cf377-kube-api-access-jpjms\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:46.054040 master-0 kubenswrapper[7689]: I0307 21:17:46.053953 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" Mar 07 21:17:46.075672 master-0 kubenswrapper[7689]: I0307 21:17:46.075555 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:46.152923 master-0 kubenswrapper[7689]: I0307 21:17:46.152818 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-fdltd" podUID="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" containerName="registry-server" probeResult="failure" output=< Mar 07 21:17:46.152923 master-0 kubenswrapper[7689]: timeout: failed to connect service ":50051" within 1s Mar 07 21:17:46.152923 master-0 kubenswrapper[7689]: > Mar 07 21:17:46.318553 master-0 kubenswrapper[7689]: I0307 21:17:46.318497 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" event={"ID":"d50f92ea-1c78-4535-a14c-96b00f2cf377","Type":"ContainerStarted","Data":"1ca4880cca3c21e3d7b1cda1ce4ee79b5948da96d4adeaa90fb0268e490efa53"} Mar 07 21:17:46.327474 master-0 kubenswrapper[7689]: I0307 21:17:46.327170 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" event={"ID":"5446df8b-23d4-4bf3-84ac-d8e1d18813af","Type":"ContainerStarted","Data":"3ab9dd805d9b79497b92326abb160474bcd3902ecfdefb910d4b32f4b19caf62"} Mar 07 21:17:46.327474 master-0 kubenswrapper[7689]: I0307 21:17:46.327217 7689 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" event={"ID":"5446df8b-23d4-4bf3-84ac-d8e1d18813af","Type":"ContainerStarted","Data":"404e2d81c63c2fba1b320a0b186d8fcfd12c559ca83bb773f1ba38fc1d224277"} Mar 07 21:17:46.327474 master-0 kubenswrapper[7689]: I0307 21:17:46.327230 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" event={"ID":"5446df8b-23d4-4bf3-84ac-d8e1d18813af","Type":"ContainerStarted","Data":"838330722d2b77da035705dec1c88bcde51f4cbc19b4ce09cd15f9636d1831b9"} Mar 07 21:17:46.365522 master-0 kubenswrapper[7689]: I0307 21:17:46.363852 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" podStartSLOduration=2.363812556 podStartE2EDuration="2.363812556s" podCreationTimestamp="2026-03-07 21:17:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:46.352437231 +0000 UTC m=+199.904764143" watchObservedRunningTime="2026-03-07 21:17:46.363812556 +0000 UTC m=+199.916139538" Mar 07 21:17:46.382570 master-0 kubenswrapper[7689]: I0307 21:17:46.382496 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:46.576575 master-0 kubenswrapper[7689]: I0307 21:17:46.576415 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr"] Mar 07 21:17:46.924394 master-0 kubenswrapper[7689]: E0307 21:17:46.924309 7689 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:46.924640 master-0 kubenswrapper[7689]: E0307 21:17:46.924467 7689 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates podName:9515e34b-addf-487a-adf8-c6ef24fcc54c nodeName:}" failed. No retries permitted until 2026-03-07 21:17:47.424436894 +0000 UTC m=+200.976763796 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates") pod "prometheus-operator-admission-webhook-8464df8497-lxzml" (UID: "9515e34b-addf-487a-adf8-c6ef24fcc54c") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:47.187282 master-0 kubenswrapper[7689]: I0307 21:17:47.187137 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 07 21:17:47.339649 master-0 kubenswrapper[7689]: I0307 21:17:47.339504 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" event={"ID":"6d5765e6-80cc-404b-b375-c109febd1843","Type":"ContainerStarted","Data":"7f4da5ed7bb1a9dae36be6515e42a275d07c99bb0c42ed31a0c22b86bbca489d"} Mar 07 21:17:47.339649 master-0 kubenswrapper[7689]: I0307 21:17:47.339570 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" event={"ID":"6d5765e6-80cc-404b-b375-c109febd1843","Type":"ContainerStarted","Data":"0d6ce5c921f4c23cf75893970c3672194512ea3e6e2c3df0b77494942ff24a81"} Mar 07 21:17:47.364915 master-0 kubenswrapper[7689]: I0307 21:17:47.364778 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" podStartSLOduration=246.364746446 podStartE2EDuration="4m6.364746446s" podCreationTimestamp="2026-03-07 21:13:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-07 21:17:47.358472623 +0000 UTC m=+200.910799585" watchObservedRunningTime="2026-03-07 21:17:47.364746446 +0000 UTC m=+200.917073378" Mar 07 21:17:47.445775 master-0 kubenswrapper[7689]: I0307 21:17:47.445373 7689 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 21:17:47.487713 master-0 kubenswrapper[7689]: I0307 21:17:47.487419 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-lxzml\" (UID: \"9515e34b-addf-487a-adf8-c6ef24fcc54c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:47.493113 master-0 kubenswrapper[7689]: I0307 21:17:47.493047 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-lxzml\" (UID: \"9515e34b-addf-487a-adf8-c6ef24fcc54c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:47.535318 master-0 kubenswrapper[7689]: I0307 21:17:47.535250 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:48.032121 master-0 kubenswrapper[7689]: I0307 21:17:48.031789 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-7w8wf_ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/authentication-operator/0.log" Mar 07 21:17:48.226388 master-0 kubenswrapper[7689]: I0307 21:17:48.226323 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication-operator_authentication-operator-7c6989d6c4-7w8wf_ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/authentication-operator/1.log" Mar 07 21:17:48.626692 master-0 kubenswrapper[7689]: I0307 21:17:48.626625 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67cf6dffcb-4z6hx_7d462ed3-d191-42a5-b8e0-79ab9af13991/fix-audit-permissions/0.log" Mar 07 21:17:48.832708 master-0 kubenswrapper[7689]: I0307 21:17:48.832500 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-oauth-apiserver_apiserver-67cf6dffcb-4z6hx_7d462ed3-d191-42a5-b8e0-79ab9af13991/oauth-apiserver/0.log" Mar 07 21:17:48.920423 master-0 kubenswrapper[7689]: I0307 21:17:48.920352 7689 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml"] Mar 07 21:17:48.923487 master-0 kubenswrapper[7689]: W0307 21:17:48.923405 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9515e34b_addf_487a_adf8_c6ef24fcc54c.slice/crio-1161350ae27bb5fa66a93e52971a8d01090c473cf7b64c507736b9667f58acfd WatchSource:0}: Error finding container 1161350ae27bb5fa66a93e52971a8d01090c473cf7b64c507736b9667f58acfd: Status 404 returned error can't find the container with id 1161350ae27bb5fa66a93e52971a8d01090c473cf7b64c507736b9667f58acfd Mar 07 21:17:49.026983 master-0 kubenswrapper[7689]: 
I0307 21:17:49.026900 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-lc94h_5f82d4aa-0cb5-477f-944e-745a21d124fc/etcd-operator/0.log" Mar 07 21:17:49.221717 master-0 kubenswrapper[7689]: I0307 21:17:49.221632 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd-operator_etcd-operator-5884b9cd56-lc94h_5f82d4aa-0cb5-477f-944e-745a21d124fc/etcd-operator/1.log" Mar 07 21:17:49.351126 master-0 kubenswrapper[7689]: I0307 21:17:49.351048 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" event={"ID":"d50f92ea-1c78-4535-a14c-96b00f2cf377","Type":"ContainerStarted","Data":"2152743e32ae286ca8e6d81bc3587535eea3dd4738f772ed85e31e21147b7d5e"} Mar 07 21:17:49.352668 master-0 kubenswrapper[7689]: I0307 21:17:49.352601 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" event={"ID":"9515e34b-addf-487a-adf8-c6ef24fcc54c","Type":"ContainerStarted","Data":"1161350ae27bb5fa66a93e52971a8d01090c473cf7b64c507736b9667f58acfd"} Mar 07 21:17:49.384265 master-0 kubenswrapper[7689]: I0307 21:17:49.384161 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" podStartSLOduration=155.012699263 podStartE2EDuration="2m37.384135019s" podCreationTimestamp="2026-03-07 21:15:12 +0000 UTC" firstStartedPulling="2026-03-07 21:17:46.126781101 +0000 UTC m=+199.679108033" lastFinishedPulling="2026-03-07 21:17:48.498216857 +0000 UTC m=+202.050543789" observedRunningTime="2026-03-07 21:17:49.378073553 +0000 UTC m=+202.930400485" watchObservedRunningTime="2026-03-07 21:17:49.384135019 +0000 UTC m=+202.936461941" Mar 07 21:17:49.426036 master-0 kubenswrapper[7689]: I0307 21:17:49.425923 7689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/setup/0.log" Mar 07 21:17:49.620906 master-0 kubenswrapper[7689]: I0307 21:17:49.620855 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-ensure-env-vars/0.log" Mar 07 21:17:49.824973 master-0 kubenswrapper[7689]: I0307 21:17:49.823772 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-resources-copy/0.log" Mar 07 21:17:49.871105 master-0 kubenswrapper[7689]: I0307 21:17:49.870922 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-xskwx"] Mar 07 21:17:49.873106 master-0 kubenswrapper[7689]: I0307 21:17:49.871947 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:49.874837 master-0 kubenswrapper[7689]: I0307 21:17:49.874362 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-x6w69" Mar 07 21:17:49.874837 master-0 kubenswrapper[7689]: I0307 21:17:49.874529 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 21:17:49.875916 master-0 kubenswrapper[7689]: I0307 21:17:49.875857 7689 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 07 21:17:49.926047 master-0 kubenswrapper[7689]: I0307 21:17:49.925984 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bqlq\" (UniqueName: \"kubernetes.io/projected/599c055c-3517-46cb-b584-0050b12a7dea-kube-api-access-6bqlq\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " 
pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:49.926320 master-0 kubenswrapper[7689]: I0307 21:17:49.926066 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:49.926320 master-0 kubenswrapper[7689]: I0307 21:17:49.926096 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:50.021595 master-0 kubenswrapper[7689]: I0307 21:17:50.021543 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 07 21:17:50.027397 master-0 kubenswrapper[7689]: I0307 21:17:50.027328 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:50.027513 master-0 kubenswrapper[7689]: I0307 21:17:50.027422 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 
21:17:50.027579 master-0 kubenswrapper[7689]: I0307 21:17:50.027530 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqlq\" (UniqueName: \"kubernetes.io/projected/599c055c-3517-46cb-b584-0050b12a7dea-kube-api-access-6bqlq\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:50.032929 master-0 kubenswrapper[7689]: I0307 21:17:50.032889 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:50.033095 master-0 kubenswrapper[7689]: I0307 21:17:50.033036 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:50.052462 master-0 kubenswrapper[7689]: I0307 21:17:50.052411 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqlq\" (UniqueName: \"kubernetes.io/projected/599c055c-3517-46cb-b584-0050b12a7dea-kube-api-access-6bqlq\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:50.075785 master-0 kubenswrapper[7689]: I0307 21:17:50.075722 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:50.078984 master-0 kubenswrapper[7689]: I0307 21:17:50.078926 7689 patch_prober.go:28] interesting 
pod/router-default-79f8cd6fdd-858hg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 21:17:50.078984 master-0 kubenswrapper[7689]: [-]has-synced failed: reason withheld Mar 07 21:17:50.078984 master-0 kubenswrapper[7689]: [+]process-running ok Mar 07 21:17:50.078984 master-0 kubenswrapper[7689]: healthz check failed Mar 07 21:17:50.079268 master-0 kubenswrapper[7689]: I0307 21:17:50.079024 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" podUID="d50f92ea-1c78-4535-a14c-96b00f2cf377" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:17:50.193633 master-0 kubenswrapper[7689]: I0307 21:17:50.193456 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:50.233329 master-0 kubenswrapper[7689]: I0307 21:17:50.233261 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 07 21:17:50.360489 master-0 kubenswrapper[7689]: I0307 21:17:50.360385 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xskwx" event={"ID":"599c055c-3517-46cb-b584-0050b12a7dea","Type":"ContainerStarted","Data":"a976805a261b43c3cbc596829459288339bb9f57afae203909e8153931024f4e"} Mar 07 21:17:50.426783 master-0 kubenswrapper[7689]: I0307 21:17:50.426731 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 07 21:17:50.622984 master-0 kubenswrapper[7689]: I0307 21:17:50.622919 7689 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-readyz/0.log" Mar 07 21:17:51.079145 master-0 kubenswrapper[7689]: I0307 21:17:51.078990 7689 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-858hg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 21:17:51.079145 master-0 kubenswrapper[7689]: [-]has-synced failed: reason withheld Mar 07 21:17:51.079145 master-0 kubenswrapper[7689]: [+]process-running ok Mar 07 21:17:51.079145 master-0 kubenswrapper[7689]: healthz check failed Mar 07 21:17:51.079145 master-0 kubenswrapper[7689]: I0307 21:17:51.079065 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" podUID="d50f92ea-1c78-4535-a14c-96b00f2cf377" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:17:51.168969 master-0 kubenswrapper[7689]: I0307 21:17:51.168900 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 07 21:17:51.371519 master-0 kubenswrapper[7689]: I0307 21:17:51.371424 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-xskwx" event={"ID":"599c055c-3517-46cb-b584-0050b12a7dea","Type":"ContainerStarted","Data":"6321378f8b8336a28fe672eb992588a3dc313cc8e83b8d71aac0b7324e37db5c"} Mar 07 21:17:51.373349 master-0 kubenswrapper[7689]: I0307 21:17:51.373265 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" event={"ID":"9515e34b-addf-487a-adf8-c6ef24fcc54c","Type":"ContainerStarted","Data":"29b6719f5ef8afa08d2d3897eb17c38d53f2a7b0b8857d549e62a631348adb5a"} Mar 07 21:17:51.373670 master-0 kubenswrapper[7689]: I0307 21:17:51.373626 
7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:51.380771 master-0 kubenswrapper[7689]: I0307 21:17:51.380715 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:51.792403 master-0 kubenswrapper[7689]: I0307 21:17:51.792171 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_installer-1-master-0_e757a93e-91aa-4fce-949b-4c51a060528e/installer/0.log" Mar 07 21:17:51.793127 master-0 kubenswrapper[7689]: I0307 21:17:51.792730 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-xskwx" podStartSLOduration=2.792676365 podStartE2EDuration="2.792676365s" podCreationTimestamp="2026-03-07 21:17:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:17:51.788556865 +0000 UTC m=+205.340883787" watchObservedRunningTime="2026-03-07 21:17:51.792676365 +0000 UTC m=+205.345003297" Mar 07 21:17:52.079671 master-0 kubenswrapper[7689]: I0307 21:17:52.079531 7689 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-858hg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 21:17:52.079671 master-0 kubenswrapper[7689]: [-]has-synced failed: reason withheld Mar 07 21:17:52.079671 master-0 kubenswrapper[7689]: [+]process-running ok Mar 07 21:17:52.079671 master-0 kubenswrapper[7689]: healthz check failed Mar 07 21:17:52.080472 master-0 kubenswrapper[7689]: I0307 21:17:52.080438 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" 
podUID="d50f92ea-1c78-4535-a14c-96b00f2cf377" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:17:52.230555 master-0 kubenswrapper[7689]: I0307 21:17:52.230494 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-qnhrz_5b339e6a-cae6-416a-963b-2fd23cecba96/kube-apiserver-operator/0.log" Mar 07 21:17:52.545863 master-0 kubenswrapper[7689]: I0307 21:17:52.545608 7689 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" podStartSLOduration=134.178311645 podStartE2EDuration="2m15.545558123s" podCreationTimestamp="2026-03-07 21:15:37 +0000 UTC" firstStartedPulling="2026-03-07 21:17:48.926777384 +0000 UTC m=+202.479104306" lastFinishedPulling="2026-03-07 21:17:50.294023882 +0000 UTC m=+203.846350784" observedRunningTime="2026-03-07 21:17:52.54338531 +0000 UTC m=+206.095712292" watchObservedRunningTime="2026-03-07 21:17:52.545558123 +0000 UTC m=+206.097885045" Mar 07 21:17:52.552795 master-0 kubenswrapper[7689]: I0307 21:17:52.552712 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver-operator_kube-apiserver-operator-68bd585b-qnhrz_5b339e6a-cae6-416a-963b-2fd23cecba96/kube-apiserver-operator/1.log" Mar 07 21:17:52.571749 master-0 kubenswrapper[7689]: I0307 21:17:52.571631 7689 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"] Mar 07 21:17:52.572232 master-0 kubenswrapper[7689]: I0307 21:17:52.572163 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" containerID="cri-o://307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c" gracePeriod=15 Mar 07 21:17:52.572322 master-0 
kubenswrapper[7689]: I0307 21:17:52.572230 7689 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea" gracePeriod=15 Mar 07 21:17:52.575440 master-0 kubenswrapper[7689]: I0307 21:17:52.575372 7689 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 07 21:17:52.575836 master-0 kubenswrapper[7689]: E0307 21:17:52.575787 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 07 21:17:52.575836 master-0 kubenswrapper[7689]: I0307 21:17:52.575824 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver" Mar 07 21:17:52.575987 master-0 kubenswrapper[7689]: E0307 21:17:52.575856 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 07 21:17:52.575987 master-0 kubenswrapper[7689]: I0307 21:17:52.575871 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 07 21:17:52.575987 master-0 kubenswrapper[7689]: E0307 21:17:52.575898 7689 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 07 21:17:52.575987 master-0 kubenswrapper[7689]: I0307 21:17:52.575911 7689 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 07 21:17:52.576221 master-0 kubenswrapper[7689]: I0307 21:17:52.576121 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" 
containerName="kube-apiserver" Mar 07 21:17:52.576221 master-0 kubenswrapper[7689]: I0307 21:17:52.576150 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="kube-apiserver-insecure-readyz" Mar 07 21:17:52.576221 master-0 kubenswrapper[7689]: I0307 21:17:52.576167 7689 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f77c8e18b751d90bc0dfe2d4e304050" containerName="setup" Mar 07 21:17:52.578160 master-0 kubenswrapper[7689]: I0307 21:17:52.578108 7689 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 07 21:17:52.578438 master-0 kubenswrapper[7689]: I0307 21:17:52.578374 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.579003 master-0 kubenswrapper[7689]: I0307 21:17:52.578952 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.666723 master-0 kubenswrapper[7689]: I0307 21:17:52.666617 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.667110 master-0 kubenswrapper[7689]: I0307 21:17:52.666793 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.667110 master-0 kubenswrapper[7689]: I0307 
21:17:52.666940 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.667110 master-0 kubenswrapper[7689]: I0307 21:17:52.667064 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.667332 master-0 kubenswrapper[7689]: I0307 21:17:52.667151 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.667460 master-0 kubenswrapper[7689]: I0307 21:17:52.667396 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.667542 master-0 kubenswrapper[7689]: I0307 21:17:52.667488 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod 
\"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.667606 master-0 kubenswrapper[7689]: I0307 21:17:52.667540 7689 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.768987 master-0 kubenswrapper[7689]: I0307 21:17:52.768874 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.768987 master-0 kubenswrapper[7689]: I0307 21:17:52.768971 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.768987 master-0 kubenswrapper[7689]: I0307 21:17:52.768995 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.769418 master-0 kubenswrapper[7689]: I0307 21:17:52.769105 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.769418 master-0 kubenswrapper[7689]: I0307 21:17:52.769241 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.769418 master-0 kubenswrapper[7689]: I0307 21:17:52.769308 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.769418 master-0 kubenswrapper[7689]: I0307 21:17:52.769263 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.769418 master-0 kubenswrapper[7689]: I0307 21:17:52.769366 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.769808 master-0 kubenswrapper[7689]: I0307 21:17:52.769537 7689 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.769808 master-0 kubenswrapper[7689]: I0307 21:17:52.769582 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.769808 master-0 kubenswrapper[7689]: I0307 21:17:52.769741 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.769808 master-0 kubenswrapper[7689]: I0307 21:17:52.769798 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.770083 master-0 kubenswrapper[7689]: I0307 21:17:52.769844 7689 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.770083 master-0 kubenswrapper[7689]: I0307 21:17:52.769907 7689 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.770083 master-0 kubenswrapper[7689]: I0307 21:17:52.769970 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.770083 master-0 kubenswrapper[7689]: I0307 21:17:52.769997 7689 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.988468 master-0 kubenswrapper[7689]: I0307 21:17:52.980192 7689 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:52.988468 master-0 kubenswrapper[7689]: I0307 21:17:52.981555 7689 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:52.989893 master-0 kubenswrapper[7689]: I0307 21:17:52.989846 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 07 21:17:53.005168 master-0 kubenswrapper[7689]: I0307 21:17:53.002796 7689 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 07 21:17:53.073078 master-0 kubenswrapper[7689]: W0307 21:17:53.073000 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdcecc61ff5eeb08bd2a3ac12599e4f9.slice/crio-f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc WatchSource:0}: Error finding container f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc: Status 404 returned error can't find the container with id f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc Mar 07 21:17:53.073619 master-0 kubenswrapper[7689]: W0307 21:17:53.073552 7689 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf417e14665db2ffffa887ce21c9ff0ed.slice/crio-f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174 WatchSource:0}: Error finding container f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174: Status 404 returned error can't find the container with id f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174 Mar 07 21:17:53.079046 master-0 kubenswrapper[7689]: I0307 21:17:53.078992 7689 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-858hg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 21:17:53.079046 master-0 kubenswrapper[7689]: [-]has-synced failed: reason withheld Mar 07 21:17:53.079046 
master-0 kubenswrapper[7689]: [+]process-running ok Mar 07 21:17:53.079046 master-0 kubenswrapper[7689]: healthz check failed Mar 07 21:17:53.079306 master-0 kubenswrapper[7689]: I0307 21:17:53.079044 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" podUID="d50f92ea-1c78-4535-a14c-96b00f2cf377" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:17:53.310319 master-0 kubenswrapper[7689]: I0307 21:17:53.307671 7689 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_34e734b7-82d6-493d-ace8-1945b2c08c6d/installer/0.log" Mar 07 21:17:53.393185 master-0 kubenswrapper[7689]: I0307 21:17:53.393120 7689 generic.go:334] "Generic (PLEG): container finished" podID="2357c135-5d09-4657-9038-48d25ed55b2d" containerID="c99ad91f1912453e3999a78e354c969699bc344538ab4adcf769bc12a98842c2" exitCode=0 Mar 07 21:17:53.393376 master-0 kubenswrapper[7689]: I0307 21:17:53.393199 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"2357c135-5d09-4657-9038-48d25ed55b2d","Type":"ContainerDied","Data":"c99ad91f1912453e3999a78e354c969699bc344538ab4adcf769bc12a98842c2"} Mar 07 21:17:53.394656 master-0 kubenswrapper[7689]: I0307 21:17:53.394203 7689 status_manager.go:851] "Failed to get status for pod" podUID="2357c135-5d09-4657-9038-48d25ed55b2d" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:17:53.395024 master-0 kubenswrapper[7689]: I0307 21:17:53.394655 7689 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:17:53.395877 master-0 kubenswrapper[7689]: I0307 21:17:53.395761 7689 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:17:53.396197 master-0 kubenswrapper[7689]: I0307 21:17:53.396162 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174"} Mar 07 21:17:53.398046 master-0 kubenswrapper[7689]: I0307 21:17:53.397995 7689 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea" exitCode=0 Mar 07 21:17:53.401893 master-0 kubenswrapper[7689]: I0307 21:17:53.401826 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612"} Mar 07 21:17:53.401893 master-0 kubenswrapper[7689]: I0307 21:17:53.401891 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc"} Mar 07 21:17:53.405493 master-0 kubenswrapper[7689]: I0307 21:17:53.404895 7689 status_manager.go:851] "Failed to 
get status for pod" podUID="2357c135-5d09-4657-9038-48d25ed55b2d" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-1-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:17:53.405627 master-0 kubenswrapper[7689]: I0307 21:17:53.405579 7689 status_manager.go:851] "Failed to get status for pod" podUID="f417e14665db2ffffa887ce21c9ff0ed" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:17:53.406911 master-0 kubenswrapper[7689]: I0307 21:17:53.406820 7689 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:17:54.079665 master-0 kubenswrapper[7689]: I0307 21:17:54.079557 7689 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-858hg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 21:17:54.079665 master-0 kubenswrapper[7689]: [-]has-synced failed: reason withheld Mar 07 21:17:54.079665 master-0 kubenswrapper[7689]: [+]process-running ok Mar 07 21:17:54.079665 master-0 kubenswrapper[7689]: healthz check failed Mar 07 21:17:54.079665 master-0 kubenswrapper[7689]: I0307 21:17:54.079643 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" podUID="d50f92ea-1c78-4535-a14c-96b00f2cf377" containerName="router" 
probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:17:54.411163 master-0 kubenswrapper[7689]: I0307 21:17:54.411096 7689 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612" exitCode=0 Mar 07 21:17:54.412356 master-0 kubenswrapper[7689]: I0307 21:17:54.411264 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612"} Mar 07 21:17:54.412356 master-0 kubenswrapper[7689]: I0307 21:17:54.411305 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425"} Mar 07 21:17:54.412356 master-0 kubenswrapper[7689]: I0307 21:17:54.411320 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5"} Mar 07 21:17:54.414193 master-0 kubenswrapper[7689]: I0307 21:17:54.414159 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"d24d032319a9f87acbbf34deb36cb14122c07e93e1e3dd0d42d28beaf572ecc6"} Mar 07 21:17:54.807186 master-0 kubenswrapper[7689]: I0307 21:17:54.807115 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:54.910561 master-0 kubenswrapper[7689]: I0307 21:17:54.910474 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"2357c135-5d09-4657-9038-48d25ed55b2d\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " Mar 07 21:17:54.910797 master-0 kubenswrapper[7689]: I0307 21:17:54.910613 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"2357c135-5d09-4657-9038-48d25ed55b2d\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " Mar 07 21:17:54.910797 master-0 kubenswrapper[7689]: I0307 21:17:54.910734 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"2357c135-5d09-4657-9038-48d25ed55b2d\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " Mar 07 21:17:54.911103 master-0 kubenswrapper[7689]: I0307 21:17:54.911067 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock" (OuterVolumeSpecName: "var-lock") pod "2357c135-5d09-4657-9038-48d25ed55b2d" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:54.911155 master-0 kubenswrapper[7689]: I0307 21:17:54.911115 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2357c135-5d09-4657-9038-48d25ed55b2d" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:54.915584 master-0 kubenswrapper[7689]: I0307 21:17:54.915505 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2357c135-5d09-4657-9038-48d25ed55b2d" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:17:55.008423 master-0 kubenswrapper[7689]: I0307 21:17:55.003138 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:55.012753 master-0 kubenswrapper[7689]: I0307 21:17:55.012650 7689 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:55.012753 master-0 kubenswrapper[7689]: I0307 21:17:55.012711 7689 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:55.012753 master-0 kubenswrapper[7689]: I0307 21:17:55.012722 7689 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:55.079397 master-0 kubenswrapper[7689]: I0307 21:17:55.079297 7689 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-858hg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 21:17:55.079397 master-0 kubenswrapper[7689]: [-]has-synced failed: reason withheld Mar 07 21:17:55.079397 
master-0 kubenswrapper[7689]: [+]process-running ok Mar 07 21:17:55.079397 master-0 kubenswrapper[7689]: healthz check failed Mar 07 21:17:55.079767 master-0 kubenswrapper[7689]: I0307 21:17:55.079430 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" podUID="d50f92ea-1c78-4535-a14c-96b00f2cf377" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:17:55.144732 master-0 kubenswrapper[7689]: I0307 21:17:55.144656 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:55.166184 master-0 kubenswrapper[7689]: I0307 21:17:55.165025 7689 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:55.230232 master-0 kubenswrapper[7689]: I0307 21:17:55.230075 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:55.486791 master-0 kubenswrapper[7689]: I0307 21:17:55.483571 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166"} Mar 07 21:17:55.486791 master-0 kubenswrapper[7689]: I0307 21:17:55.483774 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4"} Mar 07 21:17:55.496983 master-0 kubenswrapper[7689]: I0307 21:17:55.493243 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:55.507046 master-0 kubenswrapper[7689]: I0307 21:17:55.502852 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" event={"ID":"2357c135-5d09-4657-9038-48d25ed55b2d","Type":"ContainerDied","Data":"ef98d2107480b4bd6f967de3d6f619d44a784a65573272c4ea9717c84d83ed26"} Mar 07 21:17:55.507046 master-0 kubenswrapper[7689]: I0307 21:17:55.502919 7689 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef98d2107480b4bd6f967de3d6f619d44a784a65573272c4ea9717c84d83ed26" Mar 07 21:17:56.076269 master-0 kubenswrapper[7689]: I0307 21:17:56.076099 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:56.078815 master-0 kubenswrapper[7689]: I0307 21:17:56.078769 7689 patch_prober.go:28] interesting pod/router-default-79f8cd6fdd-858hg container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 07 21:17:56.078815 master-0 kubenswrapper[7689]: [-]has-synced failed: reason withheld Mar 07 21:17:56.078815 master-0 kubenswrapper[7689]: [+]process-running ok Mar 07 21:17:56.078815 master-0 kubenswrapper[7689]: healthz check failed Mar 07 21:17:56.079000 master-0 kubenswrapper[7689]: I0307 21:17:56.078818 7689 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-79f8cd6fdd-858hg" podUID="d50f92ea-1c78-4535-a14c-96b00f2cf377" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 07 21:17:56.395082 master-0 kubenswrapper[7689]: I0307 21:17:56.393923 7689 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:17:56.501434 master-0 kubenswrapper[7689]: I0307 21:17:56.501370 7689 generic.go:334] "Generic (PLEG): container finished" podID="5f77c8e18b751d90bc0dfe2d4e304050" containerID="307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c" exitCode=0 Mar 07 21:17:56.501958 master-0 kubenswrapper[7689]: I0307 21:17:56.501461 7689 scope.go:117] "RemoveContainer" containerID="703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea" Mar 07 21:17:56.501958 master-0 kubenswrapper[7689]: I0307 21:17:56.501581 7689 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" Mar 07 21:17:56.508312 master-0 kubenswrapper[7689]: I0307 21:17:56.508266 7689 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d"} Mar 07 21:17:56.508579 master-0 kubenswrapper[7689]: I0307 21:17:56.508530 7689 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:56.516912 master-0 kubenswrapper[7689]: I0307 21:17:56.516866 7689 scope.go:117] "RemoveContainer" containerID="307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c" Mar 07 21:17:56.542058 master-0 kubenswrapper[7689]: I0307 21:17:56.541903 7689 scope.go:117] "RemoveContainer" containerID="d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71" Mar 07 21:17:56.557097 master-0 kubenswrapper[7689]: I0307 21:17:56.557022 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: 
\"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 07 21:17:56.557227 master-0 kubenswrapper[7689]: I0307 21:17:56.557169 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 07 21:17:56.557227 master-0 kubenswrapper[7689]: I0307 21:17:56.557219 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 07 21:17:56.557355 master-0 kubenswrapper[7689]: I0307 21:17:56.557321 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 07 21:17:56.557412 master-0 kubenswrapper[7689]: I0307 21:17:56.557389 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 07 21:17:56.557464 master-0 kubenswrapper[7689]: I0307 21:17:56.557440 7689 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") pod \"5f77c8e18b751d90bc0dfe2d4e304050\" (UID: \"5f77c8e18b751d90bc0dfe2d4e304050\") " Mar 07 21:17:56.557781 master-0 kubenswrapper[7689]: I0307 21:17:56.557735 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets" 
(OuterVolumeSpecName: "secrets") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:56.557781 master-0 kubenswrapper[7689]: I0307 21:17:56.557728 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs" (OuterVolumeSpecName: "logs") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:56.557886 master-0 kubenswrapper[7689]: I0307 21:17:56.557788 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:56.557886 master-0 kubenswrapper[7689]: I0307 21:17:56.557797 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config" (OuterVolumeSpecName: "config") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:56.557886 master-0 kubenswrapper[7689]: I0307 21:17:56.557807 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "ssl-certs-host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:56.558001 master-0 kubenswrapper[7689]: I0307 21:17:56.557899 7689 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "5f77c8e18b751d90bc0dfe2d4e304050" (UID: "5f77c8e18b751d90bc0dfe2d4e304050"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:17:56.558370 master-0 kubenswrapper[7689]: I0307 21:17:56.558334 7689 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:56.558431 master-0 kubenswrapper[7689]: I0307 21:17:56.558365 7689 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:56.558431 master-0 kubenswrapper[7689]: I0307 21:17:56.558387 7689 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:56.558431 master-0 kubenswrapper[7689]: I0307 21:17:56.558401 7689 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:56.558431 master-0 kubenswrapper[7689]: I0307 21:17:56.558416 7689 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:56.558431 master-0 kubenswrapper[7689]: I0307 21:17:56.558429 7689 reconciler_common.go:293] 
"Volume detached for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/5f77c8e18b751d90bc0dfe2d4e304050-secrets\") on node \"master-0\" DevicePath \"\"" Mar 07 21:17:56.565015 master-0 kubenswrapper[7689]: I0307 21:17:56.564968 7689 scope.go:117] "RemoveContainer" containerID="703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea" Mar 07 21:17:56.565585 master-0 kubenswrapper[7689]: E0307 21:17:56.565552 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea\": container with ID starting with 703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea not found: ID does not exist" containerID="703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea" Mar 07 21:17:56.565654 master-0 kubenswrapper[7689]: I0307 21:17:56.565608 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea"} err="failed to get container status \"703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea\": rpc error: code = NotFound desc = could not find container \"703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea\": container with ID starting with 703d1856b68bad1a0bb04e19ce18f4bb31d00ab4490b4cc13327e03ab07841ea not found: ID does not exist" Mar 07 21:17:56.565736 master-0 kubenswrapper[7689]: I0307 21:17:56.565652 7689 scope.go:117] "RemoveContainer" containerID="307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c" Mar 07 21:17:56.566225 master-0 kubenswrapper[7689]: E0307 21:17:56.566190 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c\": container with ID starting with 307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c not 
found: ID does not exist" containerID="307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c" Mar 07 21:17:56.566225 master-0 kubenswrapper[7689]: I0307 21:17:56.566216 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c"} err="failed to get container status \"307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c\": rpc error: code = NotFound desc = could not find container \"307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c\": container with ID starting with 307f2271166e46af834d6e8be9e0b32f347ff55144221bcc2537d4617133950c not found: ID does not exist" Mar 07 21:17:56.566334 master-0 kubenswrapper[7689]: I0307 21:17:56.566235 7689 scope.go:117] "RemoveContainer" containerID="d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71" Mar 07 21:17:56.566691 master-0 kubenswrapper[7689]: E0307 21:17:56.566622 7689 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71\": container with ID starting with d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71 not found: ID does not exist" containerID="d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71" Mar 07 21:17:56.566761 master-0 kubenswrapper[7689]: I0307 21:17:56.566666 7689 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71"} err="failed to get container status \"d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71\": rpc error: code = NotFound desc = could not find container \"d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71\": container with ID starting with d8612f18e8048674d0b5b632e5e2ee3f75601d2ef6fdf9595cfa75fd94faec71 not found: ID does not exist" Mar 07 
21:17:56.692883 master-0 kubenswrapper[7689]: I0307 21:17:56.692812 7689 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f77c8e18b751d90bc0dfe2d4e304050" path="/var/lib/kubelet/pods/5f77c8e18b751d90bc0dfe2d4e304050/volumes" Mar 07 21:17:56.693259 master-0 kubenswrapper[7689]: I0307 21:17:56.693227 7689 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID="" Mar 07 21:17:56.815631 master-0 systemd[1]: Stopping Kubernetes Kubelet... Mar 07 21:17:56.836526 master-0 systemd[1]: kubelet.service: Deactivated successfully. Mar 07 21:17:56.836912 master-0 systemd[1]: Stopped Kubernetes Kubelet. Mar 07 21:17:56.838909 master-0 systemd[1]: kubelet.service: Consumed 34.709s CPU time. Mar 07 21:17:56.860220 master-0 systemd[1]: Starting Kubernetes Kubelet... Mar 07 21:17:57.002569 master-0 kubenswrapper[16352]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 21:17:57.002569 master-0 kubenswrapper[16352]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 07 21:17:57.002569 master-0 kubenswrapper[16352]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 21:17:57.002569 master-0 kubenswrapper[16352]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 07 21:17:57.002569 master-0 kubenswrapper[16352]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 07 21:17:57.002569 master-0 kubenswrapper[16352]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 07 21:17:57.003592 master-0 kubenswrapper[16352]: I0307 21:17:57.002555 16352 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 07 21:17:57.009804 master-0 kubenswrapper[16352]: W0307 21:17:57.009744 16352 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 21:17:57.009804 master-0 kubenswrapper[16352]: W0307 21:17:57.009785 16352 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 21:17:57.009804 master-0 kubenswrapper[16352]: W0307 21:17:57.009794 16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 07 21:17:57.009804 master-0 kubenswrapper[16352]: W0307 21:17:57.009800 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 21:17:57.009804 master-0 kubenswrapper[16352]: W0307 21:17:57.009805 16352 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 21:17:57.009804 master-0 kubenswrapper[16352]: W0307 21:17:57.009811 16352 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 21:17:57.009804 master-0 kubenswrapper[16352]: W0307 21:17:57.009819 16352 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009828 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009834 16352 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009840 16352 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009845 16352 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009850 16352 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009855 16352 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009860 16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009866 16352 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009871 16352 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009877 16352 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009885 16352 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009891 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009898 16352 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 21:17:57.010214 master-0 
kubenswrapper[16352]: W0307 21:17:57.009909 16352 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009917 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009923 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009933 16352 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009942 16352 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009949 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 21:17:57.010214 master-0 kubenswrapper[16352]: W0307 21:17:57.009956 16352 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.009963 16352 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.009970 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.009976 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.009983 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.009989 16352 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.009994 16352 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 
07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.009999 16352 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010007 16352 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010013 16352 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010018 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010023 16352 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010029 16352 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010034 16352 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010039 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010044 16352 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010049 16352 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010054 16352 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010060 16352 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010065 16352 feature_gate.go:330] unrecognized feature gate: 
ManagedBootImagesAWS Mar 07 21:17:57.011323 master-0 kubenswrapper[16352]: W0307 21:17:57.010070 16352 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010075 16352 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010080 16352 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010084 16352 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010090 16352 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010095 16352 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010100 16352 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010105 16352 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010110 16352 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010115 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010120 16352 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010124 16352 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010129 16352 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 21:17:57.012232 
master-0 kubenswrapper[16352]: W0307 21:17:57.010134 16352 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010141 16352 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010147 16352 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010152 16352 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010157 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010162 16352 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010167 16352 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010172 16352 feature_gate.go:330] unrecognized feature gate: Example Mar 07 21:17:57.012232 master-0 kubenswrapper[16352]: W0307 21:17:57.010177 16352 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: W0307 21:17:57.010182 16352 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: W0307 21:17:57.010187 16352 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: W0307 21:17:57.010202 16352 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: W0307 21:17:57.010207 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 
21:17:57.010337 16352 flags.go:64] FLAG: --address="0.0.0.0" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010401 16352 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010414 16352 flags.go:64] FLAG: --anonymous-auth="true" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010422 16352 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010430 16352 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010436 16352 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010444 16352 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010451 16352 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010458 16352 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010464 16352 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010470 16352 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010477 16352 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010484 16352 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010490 16352 flags.go:64] FLAG: --cgroup-root="" Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010495 16352 flags.go:64] FLAG: --cgroups-per-qos="true" 
Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010501 16352 flags.go:64] FLAG: --client-ca-file=""
Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010507 16352 flags.go:64] FLAG: --cloud-config=""
Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010516 16352 flags.go:64] FLAG: --cloud-provider=""
Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010524 16352 flags.go:64] FLAG: --cluster-dns="[]"
Mar 07 21:17:57.013589 master-0 kubenswrapper[16352]: I0307 21:17:57.010536 16352 flags.go:64] FLAG: --cluster-domain=""
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010543 16352 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010551 16352 flags.go:64] FLAG: --config-dir=""
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010558 16352 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010565 16352 flags.go:64] FLAG: --container-log-max-files="5"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010576 16352 flags.go:64] FLAG: --container-log-max-size="10Mi"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010584 16352 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010591 16352 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010598 16352 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010606 16352 flags.go:64] FLAG: --contention-profiling="false"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010612 16352 flags.go:64] FLAG: --cpu-cfs-quota="true"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010624 16352 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010631 16352 flags.go:64] FLAG: --cpu-manager-policy="none"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010638 16352 flags.go:64] FLAG: --cpu-manager-policy-options=""
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010647 16352 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010655 16352 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010662 16352 flags.go:64] FLAG: --enable-debugging-handlers="true"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010669 16352 flags.go:64] FLAG: --enable-load-reader="false"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010697 16352 flags.go:64] FLAG: --enable-server="true"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010705 16352 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010715 16352 flags.go:64] FLAG: --event-burst="100"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010722 16352 flags.go:64] FLAG: --event-qps="50"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010728 16352 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010734 16352 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010740 16352 flags.go:64] FLAG: --eviction-hard=""
Mar 07 21:17:57.014527 master-0 kubenswrapper[16352]: I0307 21:17:57.010749 16352 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010757 16352 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010765 16352 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010774 16352 flags.go:64] FLAG: --eviction-soft=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010781 16352 flags.go:64] FLAG: --eviction-soft-grace-period=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010788 16352 flags.go:64] FLAG: --exit-on-lock-contention="false"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010796 16352 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010804 16352 flags.go:64] FLAG: --experimental-mounter-path=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010811 16352 flags.go:64] FLAG: --fail-cgroupv1="false"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010818 16352 flags.go:64] FLAG: --fail-swap-on="true"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010825 16352 flags.go:64] FLAG: --feature-gates=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010834 16352 flags.go:64] FLAG: --file-check-frequency="20s"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010842 16352 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010849 16352 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010856 16352 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010864 16352 flags.go:64] FLAG: --healthz-port="10248"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010907 16352 flags.go:64] FLAG: --help="false"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010916 16352 flags.go:64] FLAG: --hostname-override=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010927 16352 flags.go:64] FLAG: --housekeeping-interval="10s"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010935 16352 flags.go:64] FLAG: --http-check-frequency="20s"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010943 16352 flags.go:64] FLAG: --image-credential-provider-bin-dir=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010950 16352 flags.go:64] FLAG: --image-credential-provider-config=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010957 16352 flags.go:64] FLAG: --image-gc-high-threshold="85"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010964 16352 flags.go:64] FLAG: --image-gc-low-threshold="80"
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010972 16352 flags.go:64] FLAG: --image-service-endpoint=""
Mar 07 21:17:57.015552 master-0 kubenswrapper[16352]: I0307 21:17:57.010980 16352 flags.go:64] FLAG: --kernel-memcg-notification="false"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.010988 16352 flags.go:64] FLAG: --kube-api-burst="100"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.010995 16352 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011003 16352 flags.go:64] FLAG: --kube-api-qps="50"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011010 16352 flags.go:64] FLAG: --kube-reserved=""
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011018 16352 flags.go:64] FLAG: --kube-reserved-cgroup=""
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011025 16352 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011033 16352 flags.go:64] FLAG: --kubelet-cgroups=""
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011040 16352 flags.go:64] FLAG: --local-storage-capacity-isolation="true"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011047 16352 flags.go:64] FLAG: --lock-file=""
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011065 16352 flags.go:64] FLAG: --log-cadvisor-usage="false"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011073 16352 flags.go:64] FLAG: --log-flush-frequency="5s"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011082 16352 flags.go:64] FLAG: --log-json-info-buffer-size="0"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011099 16352 flags.go:64] FLAG: --log-json-split-stream="false"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011107 16352 flags.go:64] FLAG: --log-text-info-buffer-size="0"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011114 16352 flags.go:64] FLAG: --log-text-split-stream="false"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011121 16352 flags.go:64] FLAG: --logging-format="text"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011129 16352 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011136 16352 flags.go:64] FLAG: --make-iptables-util-chains="true"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011144 16352 flags.go:64] FLAG: --manifest-url=""
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011151 16352 flags.go:64] FLAG: --manifest-url-header=""
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011163 16352 flags.go:64] FLAG: --max-housekeeping-interval="15s"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011170 16352 flags.go:64] FLAG: --max-open-files="1000000"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011180 16352 flags.go:64] FLAG: --max-pods="110"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011188 16352 flags.go:64] FLAG: --maximum-dead-containers="-1"
Mar 07 21:17:57.016479 master-0 kubenswrapper[16352]: I0307 21:17:57.011199 16352 flags.go:64] FLAG: --maximum-dead-containers-per-container="1"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011207 16352 flags.go:64] FLAG: --memory-manager-policy="None"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011215 16352 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011224 16352 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011232 16352 flags.go:64] FLAG: --node-ip="192.168.32.10"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011239 16352 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011307 16352 flags.go:64] FLAG: --node-status-max-images="50"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011316 16352 flags.go:64] FLAG: --node-status-update-frequency="10s"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011325 16352 flags.go:64] FLAG: --oom-score-adj="-999"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011334 16352 flags.go:64] FLAG: --pod-cidr=""
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011342 16352 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1d605384f31a8085f78a96145c2c3dc51afe22721144196140a2699b7c07ebe3"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011356 16352 flags.go:64] FLAG: --pod-manifest-path=""
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011364 16352 flags.go:64] FLAG: --pod-max-pids="-1"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011372 16352 flags.go:64] FLAG: --pods-per-core="0"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011379 16352 flags.go:64] FLAG: --port="10250"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011387 16352 flags.go:64] FLAG: --protect-kernel-defaults="false"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011394 16352 flags.go:64] FLAG: --provider-id=""
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011402 16352 flags.go:64] FLAG: --qos-reserved=""
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011413 16352 flags.go:64] FLAG: --read-only-port="10255"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011420 16352 flags.go:64] FLAG: --register-node="true"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011427 16352 flags.go:64] FLAG: --register-schedulable="true"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011434 16352 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011449 16352 flags.go:64] FLAG: --registry-burst="10"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011456 16352 flags.go:64] FLAG: --registry-qps="5"
Mar 07 21:17:57.017524 master-0 kubenswrapper[16352]: I0307 21:17:57.011465 16352 flags.go:64] FLAG: --reserved-cpus=""
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011474 16352 flags.go:64] FLAG: --reserved-memory=""
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011484 16352 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011492 16352 flags.go:64] FLAG: --root-dir="/var/lib/kubelet"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011501 16352 flags.go:64] FLAG: --rotate-certificates="false"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011584 16352 flags.go:64] FLAG: --rotate-server-certificates="false"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011593 16352 flags.go:64] FLAG: --runonce="false"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011600 16352 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011612 16352 flags.go:64] FLAG: --runtime-request-timeout="2m0s"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011620 16352 flags.go:64] FLAG: --seccomp-default="false"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011628 16352 flags.go:64] FLAG: --serialize-image-pulls="true"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011635 16352 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011643 16352 flags.go:64] FLAG: --storage-driver-db="cadvisor"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011652 16352 flags.go:64] FLAG: --storage-driver-host="localhost:8086"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011659 16352 flags.go:64] FLAG: --storage-driver-password="root"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011667 16352 flags.go:64] FLAG: --storage-driver-secure="false"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011699 16352 flags.go:64] FLAG: --storage-driver-table="stats"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011709 16352 flags.go:64] FLAG: --storage-driver-user="root"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011717 16352 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011725 16352 flags.go:64] FLAG: --sync-frequency="1m0s"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011733 16352 flags.go:64] FLAG: --system-cgroups=""
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011741 16352 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011754 16352 flags.go:64] FLAG: --system-reserved-cgroup=""
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011761 16352 flags.go:64] FLAG: --tls-cert-file=""
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011769 16352 flags.go:64] FLAG: --tls-cipher-suites="[]"
Mar 07 21:17:57.018423 master-0 kubenswrapper[16352]: I0307 21:17:57.011780 16352 flags.go:64] FLAG: --tls-min-version=""
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011792 16352 flags.go:64] FLAG: --tls-private-key-file=""
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011799 16352 flags.go:64] FLAG: --topology-manager-policy="none"
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011806 16352 flags.go:64] FLAG: --topology-manager-policy-options=""
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011813 16352 flags.go:64] FLAG: --topology-manager-scope="container"
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011821 16352 flags.go:64] FLAG: --v="2"
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011832 16352 flags.go:64] FLAG: --version="false"
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011841 16352 flags.go:64] FLAG: --vmodule=""
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011850 16352 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: I0307 21:17:57.011858 16352 flags.go:64] FLAG: --volume-stats-agg-period="1m0s"
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012054 16352 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012066 16352 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012073 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012080 16352 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012086 16352 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012097 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012105 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012113 16352 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012121 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012128 16352 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012134 16352 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012141 16352 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:17:57.019655 master-0 kubenswrapper[16352]: W0307 21:17:57.012147 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012380 16352 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012397 16352 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012405 16352 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012412 16352 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012419 16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012425 16352 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012433 16352 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012439 16352 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012446 16352 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012453 16352 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012463 16352 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012470 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012476 16352 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012483 16352 feature_gate.go:330] unrecognized feature gate: HardwareSpeed
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012490 16352 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012496 16352 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012502 16352 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012509 16352 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012519 16352 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:17:57.020457 master-0 kubenswrapper[16352]: W0307 21:17:57.012527 16352 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012535 16352 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012541 16352 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012548 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012555 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012568 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012575 16352 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012584 16352 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012592 16352 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012600 16352 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012607 16352 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012615 16352 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012621 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012732 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012741 16352 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012748 16352 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012754 16352 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012761 16352 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012770 16352 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:17:57.021231 master-0 kubenswrapper[16352]: W0307 21:17:57.012778 16352 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012785 16352 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012793 16352 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012799 16352 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012810 16352 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012817 16352 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012824 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012830 16352 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012837 16352 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012845 16352 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012851 16352 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012858 16352 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012865 16352 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012872 16352 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012878 16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012884 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012891 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012898 16352 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012907 16352 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012914 16352 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:17:57.022105 master-0 kubenswrapper[16352]: W0307 21:17:57.012920 16352 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: I0307 21:17:57.012943 16352 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: I0307 21:17:57.019936 16352 server.go:491] "Kubelet version" kubeletVersion="v1.31.14"
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: I0307 21:17:57.019961 16352 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020057 16352 feature_gate.go:330] unrecognized feature gate: DNSNameResolver
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020066 16352 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020074 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020081 16352 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020087 16352 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020093 16352 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020101 16352 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release.
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020107 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020114 16352 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020120 16352 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation
Mar 07 21:17:57.022831 master-0 kubenswrapper[16352]: W0307 21:17:57.020125 16352 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020130 16352 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020136 16352 feature_gate.go:330] unrecognized feature gate: Example
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020141 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020145 16352 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020151 16352 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020156 16352 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020161 16352 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020165 16352 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020171 16352 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020176 16352 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020181 16352 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020186 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020192 16352 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020197 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020202 16352 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020208 16352 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020214 16352 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020219 16352 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020224 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 07 21:17:57.023350 master-0 kubenswrapper[16352]: W0307 21:17:57.020229 16352 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020234 16352 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020240 16352 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020245 16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020250 16352 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020255 16352 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020260 16352 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020265 16352 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020270 16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020276 16352 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020281 16352 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020286 16352 feature_gate.go:330] unrecognized feature gate: PlatformOperators
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020291 16352 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020296 16352 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020301 16352 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020306 16352 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020311 16352 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020317 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020322 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020326 16352 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 07 21:17:57.024174 master-0 kubenswrapper[16352]: W0307 21:17:57.020331 16352 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020337 16352 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020344 16352 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020350 16352 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020355 16352 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020360 16352 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020367 16352 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release.
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020373 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020378 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020384 16352 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020389 16352 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020395 16352 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020400 16352 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020406 16352 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020412 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020417 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020423 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020429 16352 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020434 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure
Mar 07 21:17:57.024904 master-0 kubenswrapper[16352]: W0307 21:17:57.020439 16352 feature_gate.go:330] unrecognized feature
gate: OVNObservability Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020445 16352 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020451 16352 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: I0307 21:17:57.020460 16352 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020628 16352 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020636 16352 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020643 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020649 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020654 16352 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020660 16352 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020665 
16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020670 16352 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020676 16352 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020702 16352 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020710 16352 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 07 21:17:57.025615 master-0 kubenswrapper[16352]: W0307 21:17:57.020716 16352 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020722 16352 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020728 16352 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020734 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020740 16352 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020746 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020751 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020757 16352 feature_gate.go:330] unrecognized feature gate: Example Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020763 16352 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020768 16352 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020774 16352 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020779 16352 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020784 16352 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020791 16352 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020797 16352 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020805 16352 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020811 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020816 16352 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020822 16352 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 07 21:17:57.027433 master-0 kubenswrapper[16352]: W0307 21:17:57.020828 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020834 16352 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020839 16352 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020844 16352 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020850 16352 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020855 16352 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020860 16352 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020866 16352 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 07 21:17:57.028368 
master-0 kubenswrapper[16352]: W0307 21:17:57.020871 16352 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020876 16352 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020881 16352 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020887 16352 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020892 16352 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020897 16352 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020902 16352 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020907 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020912 16352 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020917 16352 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020924 16352 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020930 16352 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 07 21:17:57.028368 master-0 kubenswrapper[16352]: W0307 21:17:57.020935 16352 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 07 21:17:57.029286 master-0 
kubenswrapper[16352]: W0307 21:17:57.020940 16352 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020945 16352 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020952 16352 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020958 16352 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020964 16352 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020969 16352 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020974 16352 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020979 16352 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020985 16352 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020990 16352 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.020995 16352 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021003 16352 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021008 16352 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 07 
21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021014 16352 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021019 16352 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021025 16352 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021031 16352 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021037 16352 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 07 21:17:57.029286 master-0 kubenswrapper[16352]: W0307 21:17:57.021044 16352 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: W0307 21:17:57.021050 16352 feature_gate.go:330] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: W0307 21:17:57.021055 16352 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.021063 16352 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false StreamingCollectionEncodingToJSON:true StreamingCollectionEncodingToProtobuf:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.021328 16352 
server.go:940] "Client rotation is on, will bootstrap in background" Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.023415 16352 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.023522 16352 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.023828 16352 server.go:997] "Starting client certificate rotation" Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.023843 16352 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.024007 16352 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-03-08 21:04:42 +0000 UTC, rotation deadline is 2026-03-08 14:46:45.394158496 +0000 UTC Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.024127 16352 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 17h28m48.370034366s for next certificate rotation Mar 07 21:17:57.030375 master-0 kubenswrapper[16352]: I0307 21:17:57.024638 16352 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 21:17:57.031256 master-0 kubenswrapper[16352]: I0307 21:17:57.027050 16352 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 07 21:17:57.031256 master-0 kubenswrapper[16352]: I0307 21:17:57.031087 16352 log.go:25] "Validated CRI v1 runtime API" Mar 07 21:17:57.035664 master-0 kubenswrapper[16352]: I0307 21:17:57.035631 16352 log.go:25] "Validated CRI v1 image API" Mar 07 21:17:57.036963 master-0 kubenswrapper[16352]: I0307 21:17:57.036887 16352 server.go:1437] "Using cgroup 
driver setting received from the CRI runtime" cgroupDriver="systemd"
Mar 07 21:17:57.052024 master-0 kubenswrapper[16352]: I0307 21:17:57.051940 16352 fs.go:135] Filesystem UUIDs: map[424f727a-1c86-4a89-859c-7d0acaca7766:/dev/vda3 7B77-95E7:/dev/vda2 910678ff-f77e-4a7d-8d53-86f2ac47a823:/dev/vda4]
Mar 07 21:17:57.053758 master-0 kubenswrapper[16352]: I0307 21:17:57.052010 16352 fs.go:136] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de/userdata/shm major:0 minor:268 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84/userdata/shm major:0 minor:261 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/0d6ce5c921f4c23cf75893970c3672194512ea3e6e2c3df0b77494942ff24a81/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/0d6ce5c921f4c23cf75893970c3672194512ea3e6e2c3df0b77494942ff24a81/userdata/shm major:0 minor:1014 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/112a83bbfd7da68fd7d98c9912932beebde7c37fe463c6524a512ede7b50dc89/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/112a83bbfd7da68fd7d98c9912932beebde7c37fe463c6524a512ede7b50dc89/userdata/shm major:0 minor:883 fsType:tmpfs blockSize:0}
/run/containers/storage/overlay-containers/1161350ae27bb5fa66a93e52971a8d01090c473cf7b64c507736b9667f58acfd/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1161350ae27bb5fa66a93e52971a8d01090c473cf7b64c507736b9667f58acfd/userdata/shm major:0 minor:1029 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/1ca4880cca3c21e3d7b1cda1ce4ee79b5948da96d4adeaa90fb0268e490efa53/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/1ca4880cca3c21e3d7b1cda1ce4ee79b5948da96d4adeaa90fb0268e490efa53/userdata/shm major:0 minor:1016 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/21e44c55e16841087847adbebb0bb6c58ea019050056419446d9a85cc4d4d496/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/21e44c55e16841087847adbebb0bb6c58ea019050056419446d9a85cc4d4d496/userdata/shm major:0 minor:808 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/27a66b1a1e6596ddb9cb3d1cc895f4b835b320b66695a73366512a5d14007017/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/27a66b1a1e6596ddb9cb3d1cc895f4b835b320b66695a73366512a5d14007017/userdata/shm major:0 minor:477 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/29e340b2b6b88ee1d2fe3338e7cc62956472917066207c7d22fcd11ca5797fe0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/29e340b2b6b88ee1d2fe3338e7cc62956472917066207c7d22fcd11ca5797fe0/userdata/shm major:0 minor:309 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/2a4e91956e6af4d37253ed844488126f5600b96517ef3a0ce7d67e4b637437bf/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/2a4e91956e6af4d37253ed844488126f5600b96517ef3a0ce7d67e4b637437bf/userdata/shm major:0 minor:436 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/335b9f1124f039a1fd483115d3e476453c7ffa85e4ff68ee05e93130ee63f663/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/335b9f1124f039a1fd483115d3e476453c7ffa85e4ff68ee05e93130ee63f663/userdata/shm major:0 minor:965 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/34784b03a50881cb9335e39c23d5c919024887bcefedfbd739f247046659eb16/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/34784b03a50881cb9335e39c23d5c919024887bcefedfbd739f247046659eb16/userdata/shm major:0 minor:552 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/39bc19add4a37ed516d807e6562400e75516e577ed9bf7289f6d2ef65017c8cb/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/39bc19add4a37ed516d807e6562400e75516e577ed9bf7289f6d2ef65017c8cb/userdata/shm major:0 minor:790 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/41f511d18c601df3347c4a0ec791b96cc20cc186c323e16593ccc895c8986828/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/41f511d18c601df3347c4a0ec791b96cc20cc186c323e16593ccc895c8986828/userdata/shm major:0 minor:244 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/449f1ddce65bd4d442100d7cd54f76263e409bc0a5c1725b8cdf399ad8c9c8ba/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/449f1ddce65bd4d442100d7cd54f76263e409bc0a5c1725b8cdf399ad8c9c8ba/userdata/shm major:0 minor:487 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/4fbc6f245e73f966c864542a880588442bd18586a3e7854a57473032e1f7135f/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/4fbc6f245e73f966c864542a880588442bd18586a3e7854a57473032e1f7135f/userdata/shm major:0 minor:628 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/5006ce201ad3cd74c89114726a54a453e54bfc124c5704a26b3fa400b0f6b877/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5006ce201ad3cd74c89114726a54a453e54bfc124c5704a26b3fa400b0f6b877/userdata/shm major:0 minor:438 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/511a78a0e5ad980beedfdb42193fee9b75ca4f21b1aa1969b03cf0ced5088a16/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/511a78a0e5ad980beedfdb42193fee9b75ca4f21b1aa1969b03cf0ced5088a16/userdata/shm major:0 minor:154 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25/userdata/shm major:0 minor:667 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/54dbed5af2f7a016c39f2c1ec9963c58ffc5eb61e9822c478a7070f705204697/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/54dbed5af2f7a016c39f2c1ec9963c58ffc5eb61e9822c478a7070f705204697/userdata/shm major:0 minor:490 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5567f1923dad84459fcc9068a666c7d7b21e33dc4f847dbb0c61779518830669/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5567f1923dad84459fcc9068a666c7d7b21e33dc4f847dbb0c61779518830669/userdata/shm major:0 minor:811 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5731d94226d26524a88cd0e1f020f55306937afa54c19184462a51a135d32f71/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5731d94226d26524a88cd0e1f020f55306937afa54c19184462a51a135d32f71/userdata/shm major:0 minor:644 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/57c6ee9a56cc57dee4a273a8e3079576dd2e072ab358ac7c30617c7193ed9144/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/57c6ee9a56cc57dee4a273a8e3079576dd2e072ab358ac7c30617c7193ed9144/userdata/shm major:0 minor:788 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/59e2a2e1903d65790e4e245427eff740f03e2eb639cf0fb3b61443389c473dd4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/59e2a2e1903d65790e4e245427eff740f03e2eb639cf0fb3b61443389c473dd4/userdata/shm major:0 minor:670 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/59fb206093956750cd2b0971ba9daf6182e197e8af3331245cd46cb229bb1de1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/59fb206093956750cd2b0971ba9daf6182e197e8af3331245cd46cb229bb1de1/userdata/shm major:0 minor:746 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/5ba8d02efd97ab96c66d2e5a8c58f04777b536ec1ff43d8a222b2f0642623996/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/5ba8d02efd97ab96c66d2e5a8c58f04777b536ec1ff43d8a222b2f0642623996/userdata/shm major:0 minor:886 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/651fac4a92992d28471be54ad32158ba1aad241805110f8bbba31e3953ad5abe/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/651fac4a92992d28471be54ad32158ba1aad241805110f8bbba31e3953ad5abe/userdata/shm major:0 minor:800 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319/userdata/shm major:0 minor:129 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/6a3ab252f2a6606dc25ac128723313dc899ecdc469d39e05f29cbcf092da5942/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6a3ab252f2a6606dc25ac128723313dc899ecdc469d39e05f29cbcf092da5942/userdata/shm major:0 minor:486 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da/userdata/shm major:0 minor:104 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/6fd04d07fa9a3fb32805e8c1045c690ba26c7813c9c939367cc05dfe1bd099ee/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/6fd04d07fa9a3fb32805e8c1045c690ba26c7813c9c939367cc05dfe1bd099ee/userdata/shm major:0 minor:668 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7ac62432ddbefa836db6b7adb92368df7a58058d250dbdf00dc899851d8a07e1/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7ac62432ddbefa836db6b7adb92368df7a58058d250dbdf00dc899851d8a07e1/userdata/shm major:0 minor:525 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/7cefc3721be62a4748cdf65d432f7c4f7609bcf801065d7c8f2bec228cbb8187/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/7cefc3721be62a4748cdf65d432f7c4f7609bcf801065d7c8f2bec228cbb8187/userdata/shm major:0 minor:350 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/831064ad19912357852f314d15373db7b732cf6bf4313483f541003aef1dbf06/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/831064ad19912357852f314d15373db7b732cf6bf4313483f541003aef1dbf06/userdata/shm major:0 minor:276 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/838330722d2b77da035705dec1c88bcde51f4cbc19b4ce09cd15f9636d1831b9/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/838330722d2b77da035705dec1c88bcde51f4cbc19b4ce09cd15f9636d1831b9/userdata/shm major:0 minor:994 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/88596b62ed73d1cc0a657006e38bdd5646ef2e8ca1da1e67945f77115c8e4249/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/88596b62ed73d1cc0a657006e38bdd5646ef2e8ca1da1e67945f77115c8e4249/userdata/shm major:0 minor:660 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61/userdata/shm major:0 minor:54 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601/userdata/shm major:0 minor:238 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/8ec6f338d22c639a620f442f8c4c1b118ba292e32b27dd86d02fc64df14c1372/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/8ec6f338d22c639a620f442f8c4c1b118ba292e32b27dd86d02fc64df14c1372/userdata/shm major:0 minor:272 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7/userdata/shm major:0 minor:242 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/970d4806b55e4555ffff42e4b3c89ee95e0a6b585519742e791fd49bb6cf6a08/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/970d4806b55e4555ffff42e4b3c89ee95e0a6b585519742e791fd49bb6cf6a08/userdata/shm major:0 minor:493 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9a3242defcab78a5704c3ac516165c6355f42a0842d58543e6938dbfa54c0dc4/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9a3242defcab78a5704c3ac516165c6355f42a0842d58543e6938dbfa54c0dc4/userdata/shm major:0 minor:748 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc/userdata/shm major:0 minor:119 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9e495235becad119aa39d722482114d64ceca8622cb68745ac85876c90e3baab/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9e495235becad119aa39d722482114d64ceca8622cb68745ac85876c90e3baab/userdata/shm major:0 minor:818 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/9f7067a0c3d41100d0e0d6087ce95108117f8991ef8fa09df76a789ed7b78689/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/9f7067a0c3d41100d0e0d6087ce95108117f8991ef8fa09df76a789ed7b78689/userdata/shm major:0 minor:626 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff/userdata/shm major:0 minor:44 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/a574f1a608b3163ddfe99b7017727c7a66f0c962198037c0d402a194cb014376/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a574f1a608b3163ddfe99b7017727c7a66f0c962198037c0d402a194cb014376/userdata/shm major:0 minor:812 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a784a82dbf43a1c4004a5cc09c3b8c70da622a6ad91263a79de543269bb69473/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a784a82dbf43a1c4004a5cc09c3b8c70da622a6ad91263a79de543269bb69473/userdata/shm major:0 minor:334 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767/userdata/shm major:0 minor:148 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/a976805a261b43c3cbc596829459288339bb9f57afae203909e8153931024f4e/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/a976805a261b43c3cbc596829459288339bb9f57afae203909e8153931024f4e/userdata/shm major:0 minor:1041 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/aa4738248c68a5f24174bfee8718356f164d810da170930b0082bb8c35862648/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/aa4738248c68a5f24174bfee8718356f164d810da170930b0082bb8c35862648/userdata/shm major:0 minor:258 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582/userdata/shm major:0 minor:41 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/b91d2847ef2fd4a9afd46d414fef3fab6d77e51105ef982de75396cd9b632974/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b91d2847ef2fd4a9afd46d414fef3fab6d77e51105ef982de75396cd9b632974/userdata/shm major:0 minor:822 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/b9cb4848c544aa1c865b4801097eee547b05dfc57a09d5c556b7433efc862312/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/b9cb4848c544aa1c865b4801097eee547b05dfc57a09d5c556b7433efc862312/userdata/shm major:0 minor:810 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/bf99662680409a7aa806c014bae5b66c40427c61c312090f66a2311a2f39a24c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/bf99662680409a7aa806c014bae5b66c40427c61c312090f66a2311a2f39a24c/userdata/shm major:0 minor:100 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c098327f700751fe6a38c107559ad8b2a80af9c9060aa16b67a2b7a48e44faad/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c098327f700751fe6a38c107559ad8b2a80af9c9060aa16b67a2b7a48e44faad/userdata/shm major:0 minor:529 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c43f11af2c8b842060c66a8968b08b62d92a450aa814f560f58b0b7108694635/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c43f11af2c8b842060c66a8968b08b62d92a450aa814f560f58b0b7108694635/userdata/shm major:0 minor:494 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6/userdata/shm major:0 minor:341 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/c7a270720447e0a61bb1c8ec80a8415d28e52795162c44c7229c8de5a130a13d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/c7a270720447e0a61bb1c8ec80a8415d28e52795162c44c7229c8de5a130a13d/userdata/shm major:0 minor:802 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960/userdata/shm major:0 minor:255 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/cdde49fab8a3c629c252f1f7390a41b3c48bf77cd72b2434083e80efd11766cc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/cdde49fab8a3c629c252f1f7390a41b3c48bf77cd72b2434083e80efd11766cc/userdata/shm major:0 minor:804 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d/userdata/shm major:0 minor:248 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0/userdata/shm major:0 minor:128 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/e946a5469a45f458f7f3463d40633d8f93666f0c7a05ec65f3cba4034066232a/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/e946a5469a45f458f7f3463d40633d8f93666f0c7a05ec65f3cba4034066232a/userdata/shm major:0 minor:252 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc/userdata/shm major:0 minor:84 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f2f2e007b4a2d99fb4c65eb1e615e291749f7487d8ad5ab87f02a946335ae9ed/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f2f2e007b4a2d99fb4c65eb1e615e291749f7487d8ad5ab87f02a946335ae9ed/userdata/shm major:0 minor:819 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f5c25a913e1399497bfa0861960ba8c967f50ac5018f8a2d5b9a421ec9681e9c/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f5c25a913e1399497bfa0861960ba8c967f50ac5018f8a2d5b9a421ec9681e9c/userdata/shm major:0 minor:280 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174/userdata/shm major:0 minor:85 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f80b9c0c4a67b1a1a1e71289650eb2c4b55996a4c860501c30cc10920d663d48/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f80b9c0c4a67b1a1a1e71289650eb2c4b55996a4c860501c30cc10920d663d48/userdata/shm major:0 minor:421 fsType:tmpfs blockSize:0} /run/containers/storage/overlay-containers/f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740/userdata/shm major:0 minor:262 fsType:tmpfs blockSize:0} 
/run/containers/storage/overlay-containers/fe67bfc50554c3c039f940d887faf411984b747c8be2377d1eb15383b70de1a2/userdata/shm:{mountpoint:/run/containers/storage/overlay-containers/fe67bfc50554c3c039f940d887faf411984b747c8be2377d1eb15383b70de1a2/userdata/shm major:0 minor:491 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/15270349-f3aa-43bc-88a8-f0fff3aa2528/volumes/kubernetes.io~projected/kube-api-access-qwzgb:{mountpoint:/var/lib/kubelet/pods/15270349-f3aa-43bc-88a8-f0fff3aa2528/volumes/kubernetes.io~projected/kube-api-access-qwzgb major:0 minor:302 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/183a5212-1b21-44e4-9ed5-2f63f76e652e/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/183a5212-1b21-44e4-9ed5-2f63f76e652e/volumes/kubernetes.io~projected/ca-certs major:0 minor:434 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/183a5212-1b21-44e4-9ed5-2f63f76e652e/volumes/kubernetes.io~projected/kube-api-access-2jcxp:{mountpoint:/var/lib/kubelet/pods/183a5212-1b21-44e4-9ed5-2f63f76e652e/volumes/kubernetes.io~projected/kube-api-access-2jcxp major:0 minor:435 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/volumes/kubernetes.io~projected/kube-api-access-rp45l:{mountpoint:/var/lib/kubelet/pods/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/volumes/kubernetes.io~projected/kube-api-access-rp45l major:0 minor:771 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls:{mountpoint:/var/lib/kubelet/pods/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls major:0 minor:760 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/2369ce94-237f-41ad-9875-173578764483/volumes/kubernetes.io~projected/kube-api-access-4ds84:{mountpoint:/var/lib/kubelet/pods/2369ce94-237f-41ad-9875-173578764483/volumes/kubernetes.io~projected/kube-api-access-4ds84 major:0 minor:420 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/2369ce94-237f-41ad-9875-173578764483/volumes/kubernetes.io~secret/signing-key:{mountpoint:/var/lib/kubelet/pods/2369ce94-237f-41ad-9875-173578764483/volumes/kubernetes.io~secret/signing-key major:0 minor:415 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~projected/kube-api-access-zb5zm:{mountpoint:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~projected/kube-api-access-zb5zm major:0 minor:235 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~secret/serving-cert major:0 minor:217 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~projected/kube-api-access-f72ps:{mountpoint:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~projected/kube-api-access-f72ps major:0 minor:141 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~secret/webhook-cert major:0 minor:140 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~projected/ca-certs:{mountpoint:/var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~projected/ca-certs major:0 minor:432 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~projected/kube-api-access-zjt7j:{mountpoint:/var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~projected/kube-api-access-zjt7j major:0 minor:433 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~secret/catalogserver-certs:{mountpoint:/var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~secret/catalogserver-certs major:0 minor:427 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~projected/kube-api-access-l2w44:{mountpoint:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~projected/kube-api-access-l2w44 major:0 minor:246 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~secret/serving-cert major:0 minor:221 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3caff2c1-f178-4e16-916d-27ccf178ff37/volumes/kubernetes.io~projected/kube-api-access-2j2bf:{mountpoint:/var/lib/kubelet/pods/3caff2c1-f178-4e16-916d-27ccf178ff37/volumes/kubernetes.io~projected/kube-api-access-2j2bf major:0 minor:115 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~projected/kube-api-access major:0 minor:230 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~secret/serving-cert major:0 minor:220 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volume-subpaths/run-systemd/ovnkube-controller/6:{mountpoint:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volume-subpaths/run-systemd/ovnkube-controller/6 major:0 minor:24 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~projected/kube-api-access-tpztb:{mountpoint:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~projected/kube-api-access-tpztb major:0 minor:127 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~secret/ovn-node-metrics-cert:{mountpoint:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~secret/ovn-node-metrics-cert major:0 minor:126 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~projected/kube-api-access-4h4st:{mountpoint:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~projected/kube-api-access-4h4st major:0 minor:125 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert:{mountpoint:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert major:0 minor:124 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46d1b044-16fb-4442-a554-6b15a8a1c8ae/volumes/kubernetes.io~projected/kube-api-access-drnv4:{mountpoint:/var/lib/kubelet/pods/46d1b044-16fb-4442-a554-6b15a8a1c8ae/volumes/kubernetes.io~projected/kube-api-access-drnv4 major:0 minor:777 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/46d1b044-16fb-4442-a554-6b15a8a1c8ae/volumes/kubernetes.io~secret/machine-api-operator-tls:{mountpoint:/var/lib/kubelet/pods/46d1b044-16fb-4442-a554-6b15a8a1c8ae/volumes/kubernetes.io~secret/machine-api-operator-tls major:0 minor:729 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:226 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/kube-api-access-f748l:{mountpoint:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/kube-api-access-f748l major:0 minor:224 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~secret/metrics-tls major:0 minor:485 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e94f64e-4a89-4d9d-acbd-80f86bf2f964/volumes/kubernetes.io~projected/kube-api-access-vmp5q:{mountpoint:/var/lib/kubelet/pods/4e94f64e-4a89-4d9d-acbd-80f86bf2f964/volumes/kubernetes.io~projected/kube-api-access-vmp5q major:0 minor:620 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/4e94f64e-4a89-4d9d-acbd-80f86bf2f964/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/4e94f64e-4a89-4d9d-acbd-80f86bf2f964/volumes/kubernetes.io~secret/metrics-tls major:0 minor:621 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5446df8b-23d4-4bf3-84ac-d8e1d18813af/volumes/kubernetes.io~projected/kube-api-access-k2gv7:{mountpoint:/var/lib/kubelet/pods/5446df8b-23d4-4bf3-84ac-d8e1d18813af/volumes/kubernetes.io~projected/kube-api-access-k2gv7 major:0 minor:993 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5446df8b-23d4-4bf3-84ac-d8e1d18813af/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/5446df8b-23d4-4bf3-84ac-d8e1d18813af/volumes/kubernetes.io~secret/proxy-tls major:0 minor:989 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5625eb9f-c80b-47b1-b70c-aa636fbc03ac/volumes/kubernetes.io~projected/kube-api-access-khdpn:{mountpoint:/var/lib/kubelet/pods/5625eb9f-c80b-47b1-b70c-aa636fbc03ac/volumes/kubernetes.io~projected/kube-api-access-khdpn major:0 minor:796 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~projected/kube-api-access-6bqlq:{mountpoint:/var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~projected/kube-api-access-6bqlq major:0 minor:1040 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~secret/certs:{mountpoint:/var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~secret/certs major:0 minor:1038 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~secret/node-bootstrap-token:{mountpoint:/var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~secret/node-bootstrap-token major:0 minor:1039 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~projected/kube-api-access major:0 minor:236 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~secret/serving-cert major:0 minor:218 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~projected/kube-api-access-6qskh:{mountpoint:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~projected/kube-api-access-6qskh major:0 minor:231 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/etcd-client major:0 minor:216 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/serving-cert major:0 minor:222 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~projected/kube-api-access-jbggb:{mountpoint:/var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~projected/kube-api-access-jbggb major:0 minor:257 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~secret/metrics-tls major:0 minor:484 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/655b9f0a-cf27-443d-b0ea-3642dcae1ad2/volumes/kubernetes.io~projected/kube-api-access-7cz8d:{mountpoint:/var/lib/kubelet/pods/655b9f0a-cf27-443d-b0ea-3642dcae1ad2/volumes/kubernetes.io~projected/kube-api-access-7cz8d major:0 minor:875 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/655b9f0a-cf27-443d-b0ea-3642dcae1ad2/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/655b9f0a-cf27-443d-b0ea-3642dcae1ad2/volumes/kubernetes.io~secret/proxy-tls major:0 minor:871 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/666475e5-df4b-44ef-a2d4-39d84ab91aad/volumes/kubernetes.io~projected/kube-api-access-w94dz:{mountpoint:/var/lib/kubelet/pods/666475e5-df4b-44ef-a2d4-39d84ab91aad/volumes/kubernetes.io~projected/kube-api-access-w94dz major:0 minor:267 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~projected/kube-api-access-lng9v:{mountpoint:/var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~projected/kube-api-access-lng9v major:0 minor:241 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~secret/srv-cert major:0 minor:472 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6d5765e6-80cc-404b-b375-c109febd1843/volumes/kubernetes.io~projected/kube-api-access-8wps6:{mountpoint:/var/lib/kubelet/pods/6d5765e6-80cc-404b-b375-c109febd1843/volumes/kubernetes.io~projected/kube-api-access-8wps6 major:0 minor:1012 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6deed9a9-6702-4177-a35d-58ad9930a893/volumes/kubernetes.io~projected/kube-api-access-lzr66:{mountpoint:/var/lib/kubelet/pods/6deed9a9-6702-4177-a35d-58ad9930a893/volumes/kubernetes.io~projected/kube-api-access-lzr66 major:0 minor:680 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/6deed9a9-6702-4177-a35d-58ad9930a893/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/6deed9a9-6702-4177-a35d-58ad9930a893/volumes/kubernetes.io~secret/serving-cert major:0 minor:565 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~projected/kube-api-access-6f9rq:{mountpoint:/var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~projected/kube-api-access-6f9rq major:0 minor:225 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~secret/srv-cert:{mountpoint:/var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~secret/srv-cert major:0 minor:465 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907/volumes/kubernetes.io~projected/kube-api-access-vvzbm:{mountpoint:/var/lib/kubelet/pods/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907/volumes/kubernetes.io~projected/kube-api-access-vvzbm major:0 minor:679 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907/volumes/kubernetes.io~secret/serving-cert major:0 minor:284 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~projected/kube-api-access-4lbmm:{mountpoint:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~projected/kube-api-access-4lbmm major:0 minor:524 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/encryption-config major:0 minor:523 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/etcd-client major:0 minor:521 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/serving-cert major:0 minor:522 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f65054f-caf3-4cd3-889e-8d5a5376b1b8/volumes/kubernetes.io~projected/kube-api-access-2b28m:{mountpoint:/var/lib/kubelet/pods/7f65054f-caf3-4cd3-889e-8d5a5376b1b8/volumes/kubernetes.io~projected/kube-api-access-2b28m major:0 minor:798 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/7f69a884-5fe8-4c03-8258-ff35396efc30/volumes/kubernetes.io~projected/kube-api-access-5n27m:{mountpoint:/var/lib/kubelet/pods/7f69a884-5fe8-4c03-8258-ff35396efc30/volumes/kubernetes.io~projected/kube-api-access-5n27m major:0 minor:773 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7f69a884-5fe8-4c03-8258-ff35396efc30/volumes/kubernetes.io~secret/proxy-tls:{mountpoint:/var/lib/kubelet/pods/7f69a884-5fe8-4c03-8258-ff35396efc30/volumes/kubernetes.io~secret/proxy-tls major:0 minor:720 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/7fa7b789-9201-493e-a96d-484a2622301a/volumes/kubernetes.io~projected/kube-api-access-5nnk5:{mountpoint:/var/lib/kubelet/pods/7fa7b789-9201-493e-a96d-484a2622301a/volumes/kubernetes.io~projected/kube-api-access-5nnk5 major:0 minor:349 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~projected/kube-api-access-wvpvs:{mountpoint:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~projected/kube-api-access-wvpvs major:0 minor:229 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert major:0 minor:214 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8512a7f6-889f-483e-960f-1ce3c834e92c/volumes/kubernetes.io~projected/kube-api-access-fqtbf:{mountpoint:/var/lib/kubelet/pods/8512a7f6-889f-483e-960f-1ce3c834e92c/volumes/kubernetes.io~projected/kube-api-access-fqtbf major:0 minor:735 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/8512a7f6-889f-483e-960f-1ce3c834e92c/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/8512a7f6-889f-483e-960f-1ce3c834e92c/volumes/kubernetes.io~secret/serving-cert major:0 minor:730 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78/volumes/kubernetes.io~projected/kube-api-access-d9mmg:{mountpoint:/var/lib/kubelet/pods/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78/volumes/kubernetes.io~projected/kube-api-access-d9mmg major:0 minor:797 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert major:0 minor:794 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/9515e34b-addf-487a-adf8-c6ef24fcc54c/volumes/kubernetes.io~secret/tls-certificates:{mountpoint:/var/lib/kubelet/pods/9515e34b-addf-487a-adf8-c6ef24fcc54c/volumes/kubernetes.io~secret/tls-certificates major:0 minor:1024 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/96cfa9d3-fc26-42e9-8bac-ff2c25223654/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/96cfa9d3-fc26-42e9-8bac-ff2c25223654/volumes/kubernetes.io~projected/kube-api-access major:0 minor:481 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/96cfa9d3-fc26-42e9-8bac-ff2c25223654/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/96cfa9d3-fc26-42e9-8bac-ff2c25223654/volumes/kubernetes.io~secret/serving-cert major:0 minor:98 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~projected/kube-api-access-9rkvj:{mountpoint:/var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~projected/kube-api-access-9rkvj major:0 minor:254 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~secret/webhook-certs:{mountpoint:/var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~secret/webhook-certs major:0 minor:470 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~projected/kube-api-access-rxkw8:{mountpoint:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~projected/kube-api-access-rxkw8 major:0 minor:265 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~secret/cert major:0 minor:482 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls:{mountpoint:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls major:0 minor:480 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~projected/kube-api-access-zbz9p:{mountpoint:/var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~projected/kube-api-access-zbz9p major:0 minor:247 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls:{mountpoint:/var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls major:0 minor:466 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ab2f6566-730d-46f5-92ed-79e3039d24e8/volumes/kubernetes.io~projected/kube-api-access-vjbmk:{mountpoint:/var/lib/kubelet/pods/ab2f6566-730d-46f5-92ed-79e3039d24e8/volumes/kubernetes.io~projected/kube-api-access-vjbmk major:0 minor:260 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~projected/kube-api-access:{mountpoint:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~projected/kube-api-access major:0 minor:266 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~secret/serving-cert major:0 minor:237 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b12701eb-4226-4f9c-9398-ad0c3fea7451/volumes/kubernetes.io~projected/kube-api-access-f8mm9:{mountpoint:/var/lib/kubelet/pods/b12701eb-4226-4f9c-9398-ad0c3fea7451/volumes/kubernetes.io~projected/kube-api-access-f8mm9 major:0 minor:776 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b12701eb-4226-4f9c-9398-ad0c3fea7451/volumes/kubernetes.io~secret/cert:{mountpoint:/var/lib/kubelet/pods/b12701eb-4226-4f9c-9398-ad0c3fea7451/volumes/kubernetes.io~secret/cert major:0 minor:731 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b269ae2f-44ff-46c7-9039-21fca4a7a790/volumes/kubernetes.io~projected/kube-api-access-hx8ck:{mountpoint:/var/lib/kubelet/pods/b269ae2f-44ff-46c7-9039-21fca4a7a790/volumes/kubernetes.io~projected/kube-api-access-hx8ck major:0 minor:99 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~projected/kube-api-access-kqwrr:{mountpoint:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~projected/kube-api-access-kqwrr major:0 minor:240 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~secret/serving-cert major:0 minor:215 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~empty-dir/etc-tuned:{mountpoint:/var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~empty-dir/etc-tuned major:0 minor:616 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~empty-dir/tmp:{mountpoint:/var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~empty-dir/tmp major:0 minor:615 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~projected/kube-api-access-87fml:{mountpoint:/var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~projected/kube-api-access-87fml major:0 minor:617 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~projected/kube-api-access-p2tvr:{mountpoint:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~projected/kube-api-access-p2tvr major:0 minor:233 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~secret/serving-cert major:0 minor:213 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd9cf577-3c49-417b-a6c0-9d307c113221/volumes/kubernetes.io~projected/kube-api-access-ktjs9:{mountpoint:/var/lib/kubelet/pods/bd9cf577-3c49-417b-a6c0-9d307c113221/volumes/kubernetes.io~projected/kube-api-access-ktjs9 major:0 minor:799 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/bd9cf577-3c49-417b-a6c0-9d307c113221/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert:{mountpoint:/var/lib/kubelet/pods/bd9cf577-3c49-417b-a6c0-9d307c113221/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert major:0 minor:791 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~projected/kube-api-access-tq99k:{mountpoint:/var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~projected/kube-api-access-tq99k major:0 minor:882 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:881 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~secret/webhook-cert:{mountpoint:/var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~secret/webhook-cert major:0 minor:876 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9/volumes/kubernetes.io~projected/kube-api-access-pwj77:{mountpoint:/var/lib/kubelet/pods/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9/volumes/kubernetes.io~projected/kube-api-access-pwj77 major:0 minor:780 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021/volumes/kubernetes.io~projected/kube-api-access-9mzlv:{mountpoint:/var/lib/kubelet/pods/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021/volumes/kubernetes.io~projected/kube-api-access-9mzlv major:0 minor:795 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021/volumes/kubernetes.io~secret/samples-operator-tls:{mountpoint:/var/lib/kubelet/pods/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021/volumes/kubernetes.io~secret/samples-operator-tls major:0 minor:792 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ca25117a-ccd5-4628-8342-e277bb7be0e2/volumes/kubernetes.io~projected/kube-api-access-9kgkz:{mountpoint:/var/lib/kubelet/pods/ca25117a-ccd5-4628-8342-e277bb7be0e2/volumes/kubernetes.io~projected/kube-api-access-9kgkz major:0 minor:964 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ca25117a-ccd5-4628-8342-e277bb7be0e2/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls:{mountpoint:/var/lib/kubelet/pods/ca25117a-ccd5-4628-8342-e277bb7be0e2/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls major:0 minor:960 fsType:tmpfs 
blockSize:0} /var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~projected/kube-api-access-jpjms:{mountpoint:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~projected/kube-api-access-jpjms major:0 minor:1013 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/default-certificate:{mountpoint:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/default-certificate major:0 minor:1011 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/metrics-certs major:0 minor:1006 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/stats-auth:{mountpoint:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/stats-auth major:0 minor:1010 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~projected/kube-api-access-dgwj6:{mountpoint:/var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~projected/kube-api-access-dgwj6 major:0 minor:123 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~secret/metrics-certs:{mountpoint:/var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~secret/metrics-certs major:0 minor:464 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/bound-sa-token:{mountpoint:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/bound-sa-token major:0 minor:232 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/kube-api-access-wjtgs:{mountpoint:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/kube-api-access-wjtgs major:0 minor:249 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~secret/image-registry-operator-tls:{mountpoint:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~secret/image-registry-operator-tls major:0 minor:479 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e38fc940-e59a-45ff-978b-fdcdc534a2a5/volumes/kubernetes.io~projected/kube-api-access-2zppz:{mountpoint:/var/lib/kubelet/pods/e38fc940-e59a-45ff-978b-fdcdc534a2a5/volumes/kubernetes.io~projected/kube-api-access-2zppz major:0 minor:326 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e3fe386a-dea8-484a-b95a-0f3f475b1f82/volumes/kubernetes.io~projected/kube-api-access-fpck7:{mountpoint:/var/lib/kubelet/pods/e3fe386a-dea8-484a-b95a-0f3f475b1f82/volumes/kubernetes.io~projected/kube-api-access-fpck7 major:0 minor:734 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e3fe386a-dea8-484a-b95a-0f3f475b1f82/volumes/kubernetes.io~secret/machine-approver-tls:{mountpoint:/var/lib/kubelet/pods/e3fe386a-dea8-484a-b95a-0f3f475b1f82/volumes/kubernetes.io~secret/machine-approver-tls major:0 minor:733 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~projected/kube-api-access-dsspm:{mountpoint:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~projected/kube-api-access-dsspm major:0 minor:234 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~secret/serving-cert major:0 minor:219 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~projected/kube-api-access-65pgv:{mountpoint:/var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~projected/kube-api-access-65pgv major:0 minor:228 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~secret/package-server-manager-serving-cert:{mountpoint:/var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~secret/package-server-manager-serving-cert major:0 minor:474 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~projected/kube-api-access-69jxd:{mountpoint:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~projected/kube-api-access-69jxd major:0 minor:473 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/encryption-config:{mountpoint:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/encryption-config major:0 minor:371 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/etcd-client:{mountpoint:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/etcd-client major:0 minor:366 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/serving-cert major:0 minor:518 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f08edf29-c53f-452d-880b-e8ce27b05b6f/volumes/kubernetes.io~projected/kube-api-access-hqxlr:{mountpoint:/var/lib/kubelet/pods/f08edf29-c53f-452d-880b-e8ce27b05b6f/volumes/kubernetes.io~projected/kube-api-access-hqxlr major:0 minor:772 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/f2ca65f5-7dbe-4407-b38e-713592f62136/volumes/kubernetes.io~projected/kube-api-access-fs7nz:{mountpoint:/var/lib/kubelet/pods/f2ca65f5-7dbe-4407-b38e-713592f62136/volumes/kubernetes.io~projected/kube-api-access-fs7nz major:0 minor:625 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~projected/kube-api-access-kjhvg:{mountpoint:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~projected/kube-api-access-kjhvg major:0 minor:94 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~secret/metrics-tls:{mountpoint:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~secret/metrics-tls major:0 minor:43 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~projected/kube-api-access-t24zr:{mountpoint:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~projected/kube-api-access-t24zr major:0 minor:223 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~secret/apiservice-cert:{mountpoint:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~secret/apiservice-cert major:0 minor:483 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~secret/node-tuning-operator-tls:{mountpoint:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~secret/node-tuning-operator-tls major:0 minor:478 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~projected/kube-api-access-c76ff:{mountpoint:/var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~projected/kube-api-access-c76ff major:0 minor:227 fsType:tmpfs blockSize:0} 
/var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~secret/marketplace-operator-metrics:{mountpoint:/var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~secret/marketplace-operator-metrics major:0 minor:471 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~projected/kube-api-access-gnnlw:{mountpoint:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~projected/kube-api-access-gnnlw major:0 minor:251 fsType:tmpfs blockSize:0} /var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~secret/serving-cert:{mountpoint:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~secret/serving-cert major:0 minor:209 fsType:tmpfs blockSize:0} overlay_0-1000:{mountpoint:/var/lib/containers/storage/overlay/48c0fdf4e6d9bc005e93b5516f6d1432ea26329b1f3057e3e9978a9f06529a3d/merged major:0 minor:1000 fsType:overlay blockSize:0} overlay_0-1018:{mountpoint:/var/lib/containers/storage/overlay/37b9e77e8bf6afbe8d75c7187d23c5cdf7c05fcc36c8d6cb343b2b78a47ad745/merged major:0 minor:1018 fsType:overlay blockSize:0} overlay_0-102:{mountpoint:/var/lib/containers/storage/overlay/102e4f5d0feaf57a9b6984baf9484000d8cd15c04f217b66325dda777197b743/merged major:0 minor:102 fsType:overlay blockSize:0} overlay_0-1020:{mountpoint:/var/lib/containers/storage/overlay/86f06a9b7bfbd4123d1d85088eb1414e32870b937b8f61d23044f81e98898933/merged major:0 minor:1020 fsType:overlay blockSize:0} overlay_0-1022:{mountpoint:/var/lib/containers/storage/overlay/c9ff2160e869759ed5accf9c0083ecdee7089af13b37b6df02d00c9a1db0a0d0/merged major:0 minor:1022 fsType:overlay blockSize:0} overlay_0-1031:{mountpoint:/var/lib/containers/storage/overlay/5bc8693a61c2660f19af34c6b3161ea04556c7386821cdcc41f19b4a8719e84c/merged major:0 minor:1031 fsType:overlay blockSize:0} 
overlay_0-1033:{mountpoint:/var/lib/containers/storage/overlay/a3900b638b79f2a4cf49993d1a45485e74e7ee97328cb373de8aea59d5605fb3/merged major:0 minor:1033 fsType:overlay blockSize:0} overlay_0-1043:{mountpoint:/var/lib/containers/storage/overlay/2aae114dda24b496fde13a05b9aa0e4cc7d4f4707010dca35004fe6fca72324b/merged major:0 minor:1043 fsType:overlay blockSize:0} overlay_0-1045:{mountpoint:/var/lib/containers/storage/overlay/e8ddb1aa5d945ff12d616a02c4b5b6d7efc69ee219ad1efd50784900de4c43c4/merged major:0 minor:1045 fsType:overlay blockSize:0} overlay_0-1047:{mountpoint:/var/lib/containers/storage/overlay/7729f7936e4e95382700d604ab709341d10a23f33862bc5006e4777889522ba7/merged major:0 minor:1047 fsType:overlay blockSize:0} overlay_0-106:{mountpoint:/var/lib/containers/storage/overlay/7533d1363ce56a324cb4a67a5d319d1d4669201bf1675e6db220fb965a9f44b7/merged major:0 minor:106 fsType:overlay blockSize:0} overlay_0-1060:{mountpoint:/var/lib/containers/storage/overlay/92b66e972d76f4929099e1c37a40bd9c9594c95c5ea24afb71170d3ff7ba6a95/merged major:0 minor:1060 fsType:overlay blockSize:0} overlay_0-1065:{mountpoint:/var/lib/containers/storage/overlay/2b8491bf17f50cb978b82b212a41bec2e0a9937d7ddbabbb2401fc54a81439df/merged major:0 minor:1065 fsType:overlay blockSize:0} overlay_0-110:{mountpoint:/var/lib/containers/storage/overlay/6b66ed16281c1b719f3287bcd884a25ac2dabe09179a6c22af1174f52bf6ca04/merged major:0 minor:110 fsType:overlay blockSize:0} overlay_0-112:{mountpoint:/var/lib/containers/storage/overlay/23a93820ca1690317702d4e6c70506039819ca9d2ddf2cbaeba949ff0bb862c1/merged major:0 minor:112 fsType:overlay blockSize:0} overlay_0-121:{mountpoint:/var/lib/containers/storage/overlay/489bfcd3569f9d8187ecc95eb7f291d5ac20dc52a2bee189664b83584ba05317/merged major:0 minor:121 fsType:overlay blockSize:0} overlay_0-131:{mountpoint:/var/lib/containers/storage/overlay/cdf5726ac9bbc23ba841d56213cc28b0a3c6c73db9f753757f2ddf2507aa3d0c/merged major:0 minor:131 fsType:overlay blockSize:0} 
overlay_0-134:{mountpoint:/var/lib/containers/storage/overlay/a6072901ca010a047d748840c2b3572d1d16bfcaae0373f1d1ac3ad7c0035e19/merged major:0 minor:134 fsType:overlay blockSize:0} overlay_0-136:{mountpoint:/var/lib/containers/storage/overlay/71c9736e4bc5ca9f88dad6c00409fd6cb150868a4bfe067bc45fbf88f9ae00b5/merged major:0 minor:136 fsType:overlay blockSize:0} overlay_0-138:{mountpoint:/var/lib/containers/storage/overlay/d42fc86a22fc4fb80db19f70f2a52fcdc596dc983441a58a418ef7c844ea8a02/merged major:0 minor:138 fsType:overlay blockSize:0} overlay_0-150:{mountpoint:/var/lib/containers/storage/overlay/e75fb4d94971874a2a28fbdc327c9ee9baf30584f0c8a37325f6069e861b5d7b/merged major:0 minor:150 fsType:overlay blockSize:0} overlay_0-152:{mountpoint:/var/lib/containers/storage/overlay/fa93ee114fa8cc99ab6c119235c7edc7592d9999956606ce78922f44d8212c17/merged major:0 minor:152 fsType:overlay blockSize:0} overlay_0-156:{mountpoint:/var/lib/containers/storage/overlay/6df502a15cf1f57796b1cc9bc6c2216153be29b6fa6456d93e4ffbf60c17d21a/merged major:0 minor:156 fsType:overlay blockSize:0} overlay_0-158:{mountpoint:/var/lib/containers/storage/overlay/424f3fc04f26f7ea36e8b8c6eb8150f93dc0e8cbe7c2c80ed2d46d0fb263f3e0/merged major:0 minor:158 fsType:overlay blockSize:0} overlay_0-160:{mountpoint:/var/lib/containers/storage/overlay/ed464e70c1c924f1e7f418decf4391de37859617ea5e2d14c677ffce3ce0f7f6/merged major:0 minor:160 fsType:overlay blockSize:0} overlay_0-161:{mountpoint:/var/lib/containers/storage/overlay/34ec4042a8534540453643a820a4365633d23eecb7bbc5c06d27a3b8d2301ad9/merged major:0 minor:161 fsType:overlay blockSize:0} overlay_0-163:{mountpoint:/var/lib/containers/storage/overlay/f5872fa103bddddb9bf8d0473cd26fb568f462219e708308b405da1efbcda47b/merged major:0 minor:163 fsType:overlay blockSize:0} overlay_0-168:{mountpoint:/var/lib/containers/storage/overlay/6d1e48ace8503297b6ccd188d9d7eea008cdab3061a399302ba39c4a45019b6b/merged major:0 minor:168 fsType:overlay blockSize:0} 
overlay_0-179:{mountpoint:/var/lib/containers/storage/overlay/9edd7bb7bcccf00dcbef38547e8467759a006a6d690ea56e91f434ad672f81f4/merged major:0 minor:179 fsType:overlay blockSize:0} overlay_0-184:{mountpoint:/var/lib/containers/storage/overlay/42358ee0c1f2587e1ed52ec295e4ee8d5455f1204b421f55778e08d8b811eec6/merged major:0 minor:184 fsType:overlay blockSize:0} overlay_0-189:{mountpoint:/var/lib/containers/storage/overlay/0b56162755b431c428f9e23caa8457b3bf48ddd32c73320b80f25e30265d5d89/merged major:0 minor:189 fsType:overlay blockSize:0} overlay_0-194:{mountpoint:/var/lib/containers/storage/overlay/ea5babb320eef546678135f8a74f981b4f4840c715901e430f4cf080676bc6ba/merged major:0 minor:194 fsType:overlay blockSize:0} overlay_0-195:{mountpoint:/var/lib/containers/storage/overlay/2de607e7a5c24db1e8008d42d38fe99b20f7d92932313596b520cdbacd1f7571/merged major:0 minor:195 fsType:overlay blockSize:0} overlay_0-204:{mountpoint:/var/lib/containers/storage/overlay/ed045d65e49d7809738b16be32503d6faf143c67e03339a480253d4a66dbde84/merged major:0 minor:204 fsType:overlay blockSize:0} overlay_0-270:{mountpoint:/var/lib/containers/storage/overlay/b290cf55ff09c4011fffa798f55fbe574b217cc18a5ab109a63dcebad4af856c/merged major:0 minor:270 fsType:overlay blockSize:0} overlay_0-273:{mountpoint:/var/lib/containers/storage/overlay/1c812f33eb9e0c2a0660f674ec01ba46cdd4ed2df9fc5a0ef9e75f6ba5e6638f/merged major:0 minor:273 fsType:overlay blockSize:0} overlay_0-277:{mountpoint:/var/lib/containers/storage/overlay/90fd51772ea8b7b709d941265f653789d20161a1a381923b7c7c5b55d906f3e8/merged major:0 minor:277 fsType:overlay blockSize:0} overlay_0-282:{mountpoint:/var/lib/containers/storage/overlay/8922805a93a92aac6242ff187acac2482a88e6dd56553c0afbc79438dc725fc8/merged major:0 minor:282 fsType:overlay blockSize:0} overlay_0-286:{mountpoint:/var/lib/containers/storage/overlay/de59684b638b249a113e3d5b0a0e8aaeb24e8c656a899ffeafc574a95e26d8f3/merged major:0 minor:286 fsType:overlay blockSize:0} 
overlay_0-288:{mountpoint:/var/lib/containers/storage/overlay/2b889bc5bb607afdd0d5186e24d2ca769fc69f4bda82ae98a23ecd79e9d704f9/merged major:0 minor:288 fsType:overlay blockSize:0} overlay_0-290:{mountpoint:/var/lib/containers/storage/overlay/fd8de545152e072078cfae3a6187ddd6e862c45f4e95517cc01b13fc8f455bfc/merged major:0 minor:290 fsType:overlay blockSize:0} overlay_0-292:{mountpoint:/var/lib/containers/storage/overlay/2edb4ba0341ad5bbfe8288f4ec3ec6cb90c4be0fee73b247556b92dd6b610774/merged major:0 minor:292 fsType:overlay blockSize:0} overlay_0-294:{mountpoint:/var/lib/containers/storage/overlay/55eecc794b0f48d3a787866a729273a0781315507f097db0da825a80f9349ab0/merged major:0 minor:294 fsType:overlay blockSize:0} overlay_0-296:{mountpoint:/var/lib/containers/storage/overlay/6f5eba7d48e9ee11fb19e6ed56ebb69cec842347c77a23212d5c77e84bd34e8a/merged major:0 minor:296 fsType:overlay blockSize:0} overlay_0-298:{mountpoint:/var/lib/containers/storage/overlay/76efd0faae8987d9626fbdac9839331c62b9455c87575dbfd8039f71ac81ff16/merged major:0 minor:298 fsType:overlay blockSize:0} overlay_0-300:{mountpoint:/var/lib/containers/storage/overlay/bd89bf70457c0752bb085ec0c8c9e4914cb3f1ed0c242461a9211154f929b0f6/merged major:0 minor:300 fsType:overlay blockSize:0} overlay_0-303:{mountpoint:/var/lib/containers/storage/overlay/07b3f35b86a00481ee25a8d97659f34aa0cc7552a132ba6dd4515e647adff6d6/merged major:0 minor:303 fsType:overlay blockSize:0} overlay_0-311:{mountpoint:/var/lib/containers/storage/overlay/f5ea5cd0f039335f82e090f15498496192a753ffbecfb07b6397a7fce9d9679e/merged major:0 minor:311 fsType:overlay blockSize:0} overlay_0-313:{mountpoint:/var/lib/containers/storage/overlay/a7944e0e5f58da72490fa13678b9adbc0c2f12b65106aa35d134f8e010f4a4f6/merged major:0 minor:313 fsType:overlay blockSize:0} overlay_0-319:{mountpoint:/var/lib/containers/storage/overlay/810ea160266a54c1abbdbc1a503bf953917e4a73ce850d8a6f8bda73fe4b8fe6/merged major:0 minor:319 fsType:overlay blockSize:0} 
overlay_0-323:{mountpoint:/var/lib/containers/storage/overlay/a5a1801047d7b7748cf1adac1594e8e0cc0003ed8b941eda7169fd3c4fca9ccc/merged major:0 minor:323 fsType:overlay blockSize:0} overlay_0-325:{mountpoint:/var/lib/containers/storage/overlay/6b3e9b191eb38d6984e836bb55c095758d359a948c26b81e67b555d9faa40da7/merged major:0 minor:325 fsType:overlay blockSize:0} overlay_0-328:{mountpoint:/var/lib/containers/storage/overlay/b41141a2257b3039847635f06d23e494511a807d3bff475a158b1bc315c7b41c/merged major:0 minor:328 fsType:overlay blockSize:0} overlay_0-352:{mountpoint:/var/lib/containers/storage/overlay/8a8d058f6624de44f55985737f6e5b6092fada6b9d49326a86b636b8a70f6cee/merged major:0 minor:352 fsType:overlay blockSize:0} overlay_0-354:{mountpoint:/var/lib/containers/storage/overlay/f1d29b46c5ced04cc8f99e6a019b3c46f9a043a17e78597d3508db8e04b662b6/merged major:0 minor:354 fsType:overlay blockSize:0} overlay_0-360:{mountpoint:/var/lib/containers/storage/overlay/b225385ac7cfc3f8daf36c5309676069e3cbff63045c0e418db0bb2378c08e71/merged major:0 minor:360 fsType:overlay blockSize:0} overlay_0-367:{mountpoint:/var/lib/containers/storage/overlay/9dfde2885cdea074ab37b19439d8aad4c6a2f098059f7e58caeac616857ea40b/merged major:0 minor:367 fsType:overlay blockSize:0} overlay_0-369:{mountpoint:/var/lib/containers/storage/overlay/b15afca977f07d120ccff83c297b71720596990d73ce3fb4df469ed0fc3b93e2/merged major:0 minor:369 fsType:overlay blockSize:0} overlay_0-387:{mountpoint:/var/lib/containers/storage/overlay/b88b8b135b6977d0a00bbb117c320838b489fe7dff5513f2a9db56b8ce6f817e/merged major:0 minor:387 fsType:overlay blockSize:0} overlay_0-399:{mountpoint:/var/lib/containers/storage/overlay/453d7643cf5ba750b6e65ae764b1cf6c6771fc7b9f4e826109e1a4fcdf379c15/merged major:0 minor:399 fsType:overlay blockSize:0} overlay_0-401:{mountpoint:/var/lib/containers/storage/overlay/1ba3f2feaaf85d2a24ab58deb05bc847996e1da84c034882a54483d63bc6e953/merged major:0 minor:401 fsType:overlay blockSize:0} 
overlay_0-404:{mountpoint:/var/lib/containers/storage/overlay/73c268d76777207a6c6150a671ad4cd2a2fc2222158a468fa653332929963c27/merged major:0 minor:404 fsType:overlay blockSize:0} overlay_0-406:{mountpoint:/var/lib/containers/storage/overlay/fd26f8c4e955a44522ea00565f711185a1f2739156c1ccc528221da6e73bb3d4/merged major:0 minor:406 fsType:overlay blockSize:0} overlay_0-408:{mountpoint:/var/lib/containers/storage/overlay/fcde951c6f9b58aebd84585b1d979598c4c6d5fe01ffec9a40d80c987cd79dab/merged major:0 minor:408 fsType:overlay blockSize:0} overlay_0-410:{mountpoint:/var/lib/containers/storage/overlay/93da4333e277982fcf070b2a77e635238f586176ddcc34acd3b2a647dcb0abe5/merged major:0 minor:410 fsType:overlay blockSize:0} overlay_0-412:{mountpoint:/var/lib/containers/storage/overlay/0ab057a6662550dffa4b43723ae162cb45c025daa5d8a612738f661a34256b1a/merged major:0 minor:412 fsType:overlay blockSize:0} overlay_0-414:{mountpoint:/var/lib/containers/storage/overlay/b1334c4784d051f0cac0637a972119712ae22fb44c3a0b22a0bd5eb4843cd641/merged major:0 minor:414 fsType:overlay blockSize:0} overlay_0-423:{mountpoint:/var/lib/containers/storage/overlay/6094ce8bf7be11e665a419711f8e17eec6421d31d1f07f99e7bb3419dc9b0866/merged major:0 minor:423 fsType:overlay blockSize:0} overlay_0-425:{mountpoint:/var/lib/containers/storage/overlay/4067c973f0342e3ed3e15e8561a8cbb30f0ed3e0836d91098a8fd0ba50b31bff/merged major:0 minor:425 fsType:overlay blockSize:0} overlay_0-440:{mountpoint:/var/lib/containers/storage/overlay/f15da4aeb33ed932a5372fce1d214f6d730339e3027fcfdbd96e94b80ac468b2/merged major:0 minor:440 fsType:overlay blockSize:0} overlay_0-442:{mountpoint:/var/lib/containers/storage/overlay/22577890cf40cc40d7f8a5940db7c0149ad38895d3f6fd2dfefc436f0bcf18ea/merged major:0 minor:442 fsType:overlay blockSize:0} overlay_0-444:{mountpoint:/var/lib/containers/storage/overlay/1c500797e41ab9fe0e97fc773716af1c109f6119bcac8f27bb52f0f4f4cbd9f2/merged major:0 minor:444 fsType:overlay blockSize:0} 
overlay_0-45:{mountpoint:/var/lib/containers/storage/overlay/758caadcaa1cfa8e74286612cdd57a6b939480aafc06d6e35dd5fd2f5208bbb4/merged major:0 minor:45 fsType:overlay blockSize:0} overlay_0-450:{mountpoint:/var/lib/containers/storage/overlay/2f49eb166e0d7dfacb3310e2d75ec50443edc1d8e1dfbcf73c3fe80dd42c2aa4/merged major:0 minor:450 fsType:overlay blockSize:0} overlay_0-452:{mountpoint:/var/lib/containers/storage/overlay/6f7a557bfdf1255d0b527769493bd389c62ce75d235236eea8983ca2e30a3341/merged major:0 minor:452 fsType:overlay blockSize:0} overlay_0-454:{mountpoint:/var/lib/containers/storage/overlay/dad84b9dd59057468c11824a38b47dfaa2d775861f79d76103e70d9e283e6404/merged major:0 minor:454 fsType:overlay blockSize:0} overlay_0-475:{mountpoint:/var/lib/containers/storage/overlay/2f6ea105d4a73e3098c3d6953de75e92894e2283bfa73bc4f49776f381e6df3c/merged major:0 minor:475 fsType:overlay blockSize:0} overlay_0-48:{mountpoint:/var/lib/containers/storage/overlay/0ebfffe5350972289d1771df2487dd0f590848f7ef66c5a6462895eb05db8bbc/merged major:0 minor:48 fsType:overlay blockSize:0} overlay_0-498:{mountpoint:/var/lib/containers/storage/overlay/d4c7a6022e3658bb23d7f31ada09416cc82ca00ffeb7b60534206b1ae7bc17f2/merged major:0 minor:498 fsType:overlay blockSize:0} overlay_0-50:{mountpoint:/var/lib/containers/storage/overlay/579b5aaf977fbba305837afed01e3f9df9d05271e06260dc41b9ca1053a62f52/merged major:0 minor:50 fsType:overlay blockSize:0} overlay_0-501:{mountpoint:/var/lib/containers/storage/overlay/2169a72ed1e7f24a4204c07f5a035eaabc963e73ef4afec3e2da2e1097010b3b/merged major:0 minor:501 fsType:overlay blockSize:0} overlay_0-502:{mountpoint:/var/lib/containers/storage/overlay/ee65467ea6813907a71202a295cb412ca5ee3f6e22ed0c5c5e26f6a7591bff80/merged major:0 minor:502 fsType:overlay blockSize:0} overlay_0-504:{mountpoint:/var/lib/containers/storage/overlay/11853006fc2572acb4038e38c3d8a597e92c2866305acb6301eaaed9b29b20d1/merged major:0 minor:504 fsType:overlay blockSize:0} 
overlay_0-506:{mountpoint:/var/lib/containers/storage/overlay/110ff7ef9316945ed2c5c1f6c748384a875fdc59486890c6856da24c4e3d20af/merged major:0 minor:506 fsType:overlay blockSize:0} overlay_0-508:{mountpoint:/var/lib/containers/storage/overlay/7fb1ddbc2a93dbead17b17261f22aff79f4f53afd8f6623bfaef39897bf3e5a4/merged major:0 minor:508 fsType:overlay blockSize:0} overlay_0-510:{mountpoint:/var/lib/containers/storage/overlay/217890029d51239bda03dfe0149f0c39564f905519fcdcae0dff3257c0c3e279/merged major:0 minor:510 fsType:overlay blockSize:0} overlay_0-514:{mountpoint:/var/lib/containers/storage/overlay/b59d3568c7461fa9701895aad02fb6a7fd357c0f30fe2366290502eb6625d591/merged major:0 minor:514 fsType:overlay blockSize:0} overlay_0-517:{mountpoint:/var/lib/containers/storage/overlay/eb35a4a27074fb3b5329817d0f0c753a6a0718bed4095aa33475cd0f4423ef56/merged major:0 minor:517 fsType:overlay blockSize:0} overlay_0-52:{mountpoint:/var/lib/containers/storage/overlay/41380adb02e3e0d246d8a5a43b483971f80cbee116b60932e63e3f6fd6d92c39/merged major:0 minor:52 fsType:overlay blockSize:0} overlay_0-527:{mountpoint:/var/lib/containers/storage/overlay/0feace04e138039dd76c9058eb86a578530d1251bd1dfb6b7885e0f43e4fbf3d/merged major:0 minor:527 fsType:overlay blockSize:0} overlay_0-535:{mountpoint:/var/lib/containers/storage/overlay/7db450807ef46ec38c0a8fcc47f44ec4a9923803926585e070a105cbc3e92ea2/merged major:0 minor:535 fsType:overlay blockSize:0} overlay_0-537:{mountpoint:/var/lib/containers/storage/overlay/a018cf1e46795aac273f12d3d279f40fdd4ea67a37a2fa2e5f2350452eef1593/merged major:0 minor:537 fsType:overlay blockSize:0} overlay_0-539:{mountpoint:/var/lib/containers/storage/overlay/be5f8c0f86203209bcd3d12aa2a4bcbd96d95ef8e19c5a88bb0541c67592e9e2/merged major:0 minor:539 fsType:overlay blockSize:0} overlay_0-541:{mountpoint:/var/lib/containers/storage/overlay/3d34b8c17c8538ccdbaeaf231c2bfe40ae0c5ddc3201c5c6b608bef915ef3299/merged major:0 minor:541 fsType:overlay blockSize:0} 
overlay_0-545:{mountpoint:/var/lib/containers/storage/overlay/6cdd18126b437c92e8431fcdc9dcf38d2f6668a0ae40831d299934bedf932aa3/merged major:0 minor:545 fsType:overlay blockSize:0} overlay_0-547:{mountpoint:/var/lib/containers/storage/overlay/220e24fee4a54bbfb5d025ff95e3d6a34da44c0d9ea69e45b178e92ebd66180d/merged major:0 minor:547 fsType:overlay blockSize:0} overlay_0-549:{mountpoint:/var/lib/containers/storage/overlay/48ad141827d61be53cbc5aaa56ba58f3094dc7a90097abe077be17448140674a/merged major:0 minor:549 fsType:overlay blockSize:0} overlay_0-56:{mountpoint:/var/lib/containers/storage/overlay/109aee8e44bb77337a04ad611ec2fb23ebd5ba3d7c33da6649d49925374fdb9f/merged major:0 minor:56 fsType:overlay blockSize:0} overlay_0-561:{mountpoint:/var/lib/containers/storage/overlay/b41743b87ec7219d6c9c120a33dbc6dcfdad94b44c4b9d3700329b7980a2c412/merged major:0 minor:561 fsType:overlay blockSize:0} overlay_0-566:{mountpoint:/var/lib/containers/storage/overlay/5cd3234e124166a919c8c0ddddf3ce1b317384205950cfa35af6348e045a2993/merged major:0 minor:566 fsType:overlay blockSize:0} overlay_0-570:{mountpoint:/var/lib/containers/storage/overlay/cc3520007a86acb77b60191027ed8c550640b4a6573b838d6e54c8e2dc8180f3/merged major:0 minor:570 fsType:overlay blockSize:0} overlay_0-580:{mountpoint:/var/lib/containers/storage/overlay/6bc4958a02d0925a968ab513027fa42bbd853d10804106f614accaec03e3878c/merged major:0 minor:580 fsType:overlay blockSize:0} overlay_0-595:{mountpoint:/var/lib/containers/storage/overlay/29ef63fdbf600988e3edce6276d09014af90459383e4ecdea479fd75faf8a0e6/merged major:0 minor:595 fsType:overlay blockSize:0} overlay_0-597:{mountpoint:/var/lib/containers/storage/overlay/3de108a256813455ecaa63c707a379e9264017ebb50be07c44854eab0e535498/merged major:0 minor:597 fsType:overlay blockSize:0} overlay_0-599:{mountpoint:/var/lib/containers/storage/overlay/0346d13b2e0f74ce34e4baaa17937d2d6a256feb79ce82b698577fe46c2c8b22/merged major:0 minor:599 fsType:overlay blockSize:0} 
overlay_0-603:{mountpoint:/var/lib/containers/storage/overlay/858b269e9ba282da7335209dbaa4c4e88acd7ba65fce1d952f721228e1bec6a0/merged major:0 minor:603 fsType:overlay blockSize:0} overlay_0-618:{mountpoint:/var/lib/containers/storage/overlay/ba8703b002fe7449958e814287aa61f545e680ff8502daed5e9f145f836c53f8/merged major:0 minor:618 fsType:overlay blockSize:0} overlay_0-630:{mountpoint:/var/lib/containers/storage/overlay/29bfe49bef4f1558454b515b6df688b46107a815b7298baad3459e174b485cd9/merged major:0 minor:630 fsType:overlay blockSize:0} overlay_0-632:{mountpoint:/var/lib/containers/storage/overlay/e289f34926d7c5d8541892d7fb801206bea6b5429cfc684900fe9a1a80e38012/merged major:0 minor:632 fsType:overlay blockSize:0} overlay_0-634:{mountpoint:/var/lib/containers/storage/overlay/a68ec1b271a42f47934f506c165b3887c166a753a66492d0b02f8d3ffa079255/merged major:0 minor:634 fsType:overlay blockSize:0} overlay_0-643:{mountpoint:/var/lib/containers/storage/overlay/fbdf851df3998b4757cdbd84cc332495e6bdfa49012fda76d93e2aa28fd1045c/merged major:0 minor:643 fsType:overlay blockSize:0} overlay_0-65:{mountpoint:/var/lib/containers/storage/overlay/9de38ff6ec25f5bc5ae563bf5c73645261869b08ab41680171857630d4eb0458/merged major:0 minor:65 fsType:overlay blockSize:0} overlay_0-658:{mountpoint:/var/lib/containers/storage/overlay/eeeee1f691a92feb84ce20c6d5a82de0b9156b16b3424f4e1a41d52fe15bd0f1/merged major:0 minor:658 fsType:overlay blockSize:0} overlay_0-663:{mountpoint:/var/lib/containers/storage/overlay/664e2958a34aa452ee0fac69095d4f757ab8ccc305ab85b1524172bcc960a914/merged major:0 minor:663 fsType:overlay blockSize:0} overlay_0-665:{mountpoint:/var/lib/containers/storage/overlay/1d66f0025bac4117bf80939c0ec36a7531f31045e11e45ddef8f7a550e1d71ac/merged major:0 minor:665 fsType:overlay blockSize:0} overlay_0-674:{mountpoint:/var/lib/containers/storage/overlay/19cc711b01f1241b50c197aa05fd806062af267b9779067edf458e7b680be298/merged major:0 minor:674 fsType:overlay blockSize:0} 
overlay_0-69:{mountpoint:/var/lib/containers/storage/overlay/44c51460ae55a988ee5f3cf580cb457b2fbd9318dea7c493f1d18f93443c78c7/merged major:0 minor:69 fsType:overlay blockSize:0} overlay_0-690:{mountpoint:/var/lib/containers/storage/overlay/9ee1806452c35fd91d64ec79eb2d1624684d21fb13c7179e2dc510052ada5510/merged major:0 minor:690 fsType:overlay blockSize:0} overlay_0-693:{mountpoint:/var/lib/containers/storage/overlay/4ae2a9823d4394f662aa087e5d21049a3b36b6eea321d351348191761a96f1c4/merged major:0 minor:693 fsType:overlay blockSize:0} overlay_0-698:{mountpoint:/var/lib/containers/storage/overlay/8bf354574dd9d007191684661fd35e77c91df1c2929729a37e6b3fee6c7971c8/merged major:0 minor:698 fsType:overlay blockSize:0} overlay_0-701:{mountpoint:/var/lib/containers/storage/overlay/958bf32938aff901384b7b61b4327a00220d3d49ebf875d6be2d8d46703e9717/merged major:0 minor:701 fsType:overlay blockSize:0} overlay_0-703:{mountpoint:/var/lib/containers/storage/overlay/b23895c605fb034be55c2b3edb6bbf6c06f74c54ed8c793c07101d95f113baca/merged major:0 minor:703 fsType:overlay blockSize:0} overlay_0-705:{mountpoint:/var/lib/containers/storage/overlay/d12ce40995470535125dd62745592ace116bbd4354d3d5abb60d6de76e5f2ef3/merged major:0 minor:705 fsType:overlay blockSize:0} overlay_0-707:{mountpoint:/var/lib/containers/storage/overlay/5ec47ef6aa208aaa7e29ed3d1841805ad46d405807eac38fc7d8e9de61d5d38f/merged major:0 minor:707 fsType:overlay blockSize:0} overlay_0-708:{mountpoint:/var/lib/containers/storage/overlay/149dcb62a1fdd9a5f8567da163f1c68f921254007a14bcaa5c47860dd72660fc/merged major:0 minor:708 fsType:overlay blockSize:0} overlay_0-71:{mountpoint:/var/lib/containers/storage/overlay/7dbe2d26f747f79c1fb642ffeaf7739ec37f6f74a95f7fabcdbc727e015df54c/merged major:0 minor:71 fsType:overlay blockSize:0} overlay_0-710:{mountpoint:/var/lib/containers/storage/overlay/6afa3bad2cef8d9d1fd03ec1772a12486fce3584e613264c945511adf5ab30fe/merged major:0 minor:710 fsType:overlay blockSize:0} 
overlay_0-712:{mountpoint:/var/lib/containers/storage/overlay/7bcd33ed246f6f595b74a69a3b1c71e20ccb31971e12b2c20eeba4f6cf6b8a82/merged major:0 minor:712 fsType:overlay blockSize:0} overlay_0-714:{mountpoint:/var/lib/containers/storage/overlay/585e85eae6588242802e0ad4513fc3f18e1b854d9df512d603d0cd5cc52195b9/merged major:0 minor:714 fsType:overlay blockSize:0} overlay_0-716:{mountpoint:/var/lib/containers/storage/overlay/965fdb60bc8163c2eec5319058952bbced75c24869ef3e9d7dea51a03024092c/merged major:0 minor:716 fsType:overlay blockSize:0} overlay_0-719:{mountpoint:/var/lib/containers/storage/overlay/04216777e614d05eac2c14833266bdf02e4afb34ecbadabd2b46dbb278d42132/merged major:0 minor:719 fsType:overlay blockSize:0} overlay_0-722:{mountpoint:/var/lib/containers/storage/overlay/2d21c8b74b74e8117c6ea31c5e81ad463831b982fbd9b84f044b99127c1a9c0e/merged major:0 minor:722 fsType:overlay blockSize:0} overlay_0-724:{mountpoint:/var/lib/containers/storage/overlay/fd6cbbcbb6bef5a5a8637073ea2625c1772c43c5e99fd33bbe0346167691e899/merged major:0 minor:724 fsType:overlay blockSize:0} overlay_0-736:{mountpoint:/var/lib/containers/storage/overlay/8800b61127beb1a652273e43289237330b605300112577b9c056678d333f0595/merged major:0 minor:736 fsType:overlay blockSize:0} overlay_0-750:{mountpoint:/var/lib/containers/storage/overlay/4d650c8af1a383593361e038a83bc682be00a2954a5349c45f3d5ef7777f5d7a/merged major:0 minor:750 fsType:overlay blockSize:0} overlay_0-756:{mountpoint:/var/lib/containers/storage/overlay/f046f54116cbd98e03ddff0f9c192db289692dbc6b76970c488288b8640954d4/merged major:0 minor:756 fsType:overlay blockSize:0} overlay_0-758:{mountpoint:/var/lib/containers/storage/overlay/94f858c6f33ca984336c7b6b23986f2229621e423f8010d9868b53312b6908ac/merged major:0 minor:758 fsType:overlay blockSize:0} overlay_0-76:{mountpoint:/var/lib/containers/storage/overlay/16bde1e67c84c81b1adba8440d6dc1e2ad6a1eef154b9dab3d2b8f582f71d6fa/merged major:0 minor:76 fsType:overlay blockSize:0} 
overlay_0-761:{mountpoint:/var/lib/containers/storage/overlay/51d2f26591bdb68de62bc8ba448d6cf3f8efa862d930cada32447f9a9ef15b59/merged major:0 minor:761 fsType:overlay blockSize:0} overlay_0-774:{mountpoint:/var/lib/containers/storage/overlay/33207758f79e2cca51ac3acbc21198dd869b3d0531668b4fd3f25eeac12ba690/merged major:0 minor:774 fsType:overlay blockSize:0} overlay_0-783:{mountpoint:/var/lib/containers/storage/overlay/3b0a3166d6e19609a7a5e70dccd8df14de2548a2d27e0b9a690a9745fa10b7ed/merged major:0 minor:783 fsType:overlay blockSize:0} overlay_0-786:{mountpoint:/var/lib/containers/storage/overlay/099f01fc813755b61ea9a8203bb4ed82d4721bb4b9673732f149dea930d99c76/merged major:0 minor:786 fsType:overlay blockSize:0} overlay_0-806:{mountpoint:/var/lib/containers/storage/overlay/240ebb859a83827347ec22ae4f7eaa253e5646efa1f95eacd3ec3e218ab79397/merged major:0 minor:806 fsType:overlay blockSize:0} overlay_0-809:{mountpoint:/var/lib/containers/storage/overlay/870224b8f2e12f66a2f3a57543a6f0bfdb1cd0a7f01ecd5fdefbed70e7e3951b/merged major:0 minor:809 fsType:overlay blockSize:0} overlay_0-824:{mountpoint:/var/lib/containers/storage/overlay/2f7e7c7524b0a3b40a83222d229ef04ce3df21532f5fbceb1ea84dbb91b18bd6/merged major:0 minor:824 fsType:overlay blockSize:0} overlay_0-827:{mountpoint:/var/lib/containers/storage/overlay/b8554eff435f30ce19fed3f6a011261dcf6bb50cf65ce37c6dc9d7ef6544828b/merged major:0 minor:827 fsType:overlay blockSize:0} overlay_0-829:{mountpoint:/var/lib/containers/storage/overlay/33b73eb276dfbe9e3f980a923841315e47823eee0d7db6b4857a5b7a0857d3bd/merged major:0 minor:829 fsType:overlay blockSize:0} overlay_0-832:{mountpoint:/var/lib/containers/storage/overlay/3d41f30badac15258cb44bb70e680a66cb91eb8dd853083849a1467b05acb19a/merged major:0 minor:832 fsType:overlay blockSize:0} overlay_0-836:{mountpoint:/var/lib/containers/storage/overlay/65924ec925e16de2e6e94cb040bed0c7ba9e1aafb3db3c1e0e2bdd1030dd6f82/merged major:0 minor:836 fsType:overlay blockSize:0} 
overlay_0-838:{mountpoint:/var/lib/containers/storage/overlay/fbba40958a4fc4de4b3ae2cfe8382c32dedd20ec816451e134c2569c38962e68/merged major:0 minor:838 fsType:overlay blockSize:0} overlay_0-840:{mountpoint:/var/lib/containers/storage/overlay/19848ab2bf95924f922af3eac697281b4fc58b41f91182ee8a8f8a7ccbd15736/merged major:0 minor:840 fsType:overlay blockSize:0} overlay_0-841:{mountpoint:/var/lib/containers/storage/overlay/dd5b2d83020e91e886e18d865aabd627945163dc60a134e2ee925884bb0c6cce/merged major:0 minor:841 fsType:overlay blockSize:0} overlay_0-843:{mountpoint:/var/lib/containers/storage/overlay/a11d4e429771204132f79cbfe12238c7530a5db035a5dc2100bf857f2aaa537a/merged major:0 minor:843 fsType:overlay blockSize:0} overlay_0-846:{mountpoint:/var/lib/containers/storage/overlay/ec3cd1e2aa5ab3cc9fca79a19a0ae42678d7903cc38a432daf4151731e472228/merged major:0 minor:846 fsType:overlay blockSize:0} overlay_0-853:{mountpoint:/var/lib/containers/storage/overlay/c694ce911a12dd1658fdc1f53bf9d98075d12fe0a4caefe4c31be66b54aea087/merged major:0 minor:853 fsType:overlay blockSize:0} overlay_0-855:{mountpoint:/var/lib/containers/storage/overlay/85755cbdffd37cf8dd0483de92c4bb339f3df3e4ad043043c3ed8b659080b24e/merged major:0 minor:855 fsType:overlay blockSize:0} overlay_0-859:{mountpoint:/var/lib/containers/storage/overlay/41c248dba048fef5bac4ac1537f297b2c304139a81227968ee1cdd5742aa3bed/merged major:0 minor:859 fsType:overlay blockSize:0} overlay_0-86:{mountpoint:/var/lib/containers/storage/overlay/69a00dc3578e05837fe9c0d0c53d1f6edb3b93cbf7d110a67df52f1cb1a4dc9e/merged major:0 minor:86 fsType:overlay blockSize:0} overlay_0-862:{mountpoint:/var/lib/containers/storage/overlay/cc1bf12b77e5900105876808af658fba1360161f338cd61abd6ac1da46ee79cb/merged major:0 minor:862 fsType:overlay blockSize:0} overlay_0-865:{mountpoint:/var/lib/containers/storage/overlay/e2c2ba821b038cce7aedfd194a0922b725b87f0209b98e7fc7c7ea90b4553671/merged major:0 minor:865 fsType:overlay blockSize:0} 
overlay_0-867:{mountpoint:/var/lib/containers/storage/overlay/f8a9f7b3aafd2d9d0c7d6c8186253736dec058a20cca84913694901b1ceefc86/merged major:0 minor:867 fsType:overlay blockSize:0} overlay_0-869:{mountpoint:/var/lib/containers/storage/overlay/bb7809c234e0b40717710a1cacdc6af62cf1d39c9efa1a65bc449314858cf8d6/merged major:0 minor:869 fsType:overlay blockSize:0} overlay_0-885:{mountpoint:/var/lib/containers/storage/overlay/d74e12de47c3022a1dd50cd84cdb7422e58aab16f932b2d8202e93f2aea784c9/merged major:0 minor:885 fsType:overlay blockSize:0} overlay_0-893:{mountpoint:/var/lib/containers/storage/overlay/15336422cd4ad665ecbb2f3d4d0c101ceed672580a4731c6796325c93a57fa77/merged major:0 minor:893 fsType:overlay blockSize:0} overlay_0-897:{mountpoint:/var/lib/containers/storage/overlay/02c5d2d5543cc62e562588e99531edaa904d0f48a922d59befbf70d79dc687e0/merged major:0 minor:897 fsType:overlay blockSize:0} overlay_0-899:{mountpoint:/var/lib/containers/storage/overlay/295e45dc57b8e0f9a5d2dd8210d4ef5fec8fb56ef498540a25919dec991cae33/merged major:0 minor:899 fsType:overlay blockSize:0} overlay_0-901:{mountpoint:/var/lib/containers/storage/overlay/8514ee41674184627b2c7f4d49c7e8a0403b747427a4f7ad2dc82c8ba322db48/merged major:0 minor:901 fsType:overlay blockSize:0} overlay_0-903:{mountpoint:/var/lib/containers/storage/overlay/f1e04d2bb81ae040840934725dbfb95916e256ff2b115c41c9cb39a7c5355046/merged major:0 minor:903 fsType:overlay blockSize:0} overlay_0-905:{mountpoint:/var/lib/containers/storage/overlay/34ae9b89e848bf02d24c2bab951f424eab9aa22f2f01582deb3502cf7acd3bb6/merged major:0 minor:905 fsType:overlay blockSize:0} overlay_0-913:{mountpoint:/var/lib/containers/storage/overlay/561e3f3822ee88faee37b3b86bf2b9043092bc17ffb3fe1468c5b48ec729439b/merged major:0 minor:913 fsType:overlay blockSize:0} overlay_0-918:{mountpoint:/var/lib/containers/storage/overlay/0d580d8f00ee009e476c361059b2719ab85dd7f7f761940e07a8c745e2869416/merged major:0 minor:918 fsType:overlay blockSize:0} 
overlay_0-92:{mountpoint:/var/lib/containers/storage/overlay/b9289914432c8fb0ef05334628a1924332bbfac0abf1cca1ed3579efe549b237/merged major:0 minor:92 fsType:overlay blockSize:0} overlay_0-943:{mountpoint:/var/lib/containers/storage/overlay/9adf53eaaa160ed4207c808af271f8f080d1e249ebf0a82279daf35acfe0ec3d/merged major:0 minor:943 fsType:overlay blockSize:0} overlay_0-951:{mountpoint:/var/lib/containers/storage/overlay/fff90ab544c8c97ff49328e5e16dd90d973aa252344e782cb82f9352ec1ba54d/merged major:0 minor:951 fsType:overlay blockSize:0} overlay_0-953:{mountpoint:/var/lib/containers/storage/overlay/af0be4e8f0f380b20208f9f6a193c7dc372b66ac07ee6e5d0c26f0794c574193/merged major:0 minor:953 fsType:overlay blockSize:0} overlay_0-958:{mountpoint:/var/lib/containers/storage/overlay/620e397a8602003921a573e85e17f91095d28754a10d3f84d7a0bbe85a1ce45f/merged major:0 minor:958 fsType:overlay blockSize:0} overlay_0-96:{mountpoint:/var/lib/containers/storage/overlay/d9fe88f9718bc2c81dbf6bf894a344073153e91a43a11d4096fa11de155ecc1a/merged major:0 minor:96 fsType:overlay blockSize:0} overlay_0-967:{mountpoint:/var/lib/containers/storage/overlay/5ed9aa80323a5b7ee659b90a2fe473e972d8f74a6f9148e1c007825b5df560ca/merged major:0 minor:967 fsType:overlay blockSize:0} overlay_0-969:{mountpoint:/var/lib/containers/storage/overlay/82d75621dd71f5d15dfa85435237f03e7bd2f8f58cfae7a481e0a00dbf9c8963/merged major:0 minor:969 fsType:overlay blockSize:0} overlay_0-971:{mountpoint:/var/lib/containers/storage/overlay/d41d53de4cc6fde5a375bdcdd254467c7541a92efe417c3276cbe5c99d7dec2a/merged major:0 minor:971 fsType:overlay blockSize:0} overlay_0-976:{mountpoint:/var/lib/containers/storage/overlay/9a3f0c13051c5ad7691b947db52cec789c7f9c5d10e95bc2f4199bb736add347/merged major:0 minor:976 fsType:overlay blockSize:0} overlay_0-996:{mountpoint:/var/lib/containers/storage/overlay/e72be50e1ed0dd9d0bd97639cce967bc719687402eb83e435b36084f00aa20e7/merged major:0 minor:996 fsType:overlay blockSize:0} 
overlay_0-998:{mountpoint:/var/lib/containers/storage/overlay/c7df82120d71160859d8fd8351ec304b57d1ae29bcf3469146a520d3bfd69213/merged major:0 minor:998 fsType:overlay blockSize:0}] Mar 07 21:17:57.118136 master-0 kubenswrapper[16352]: I0307 21:17:57.113958 16352 manager.go:217] Machine: {Timestamp:2026-03-07 21:17:57.1124118 +0000 UTC m=+0.183116899 CPUVendorID:AuthenticAMD NumCores:16 NumPhysicalCores:1 NumSockets:16 CpuFrequency:2799998 MemoryCapacity:50514145280 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:fd388b0a7ee840b7a9a8619058f28513 SystemUUID:fd388b0a-7ee8-40b7-a9a8-619058f28513 BootID:1e0d9bad-17ce-4467-8d98-7b297ec5d412 Filesystems:[{Device:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~secret/ovn-node-metrics-cert DeviceMajor:0 DeviceMinor:126 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volumes/kubernetes.io~projected/kube-api-access-tpztb DeviceMajor:0 DeviceMinor:127 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7ac62432ddbefa836db6b7adb92368df7a58058d250dbdf00dc899851d8a07e1/userdata/shm DeviceMajor:0 DeviceMinor:525 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~projected/kube-api-access-87fml DeviceMajor:0 DeviceMinor:617 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-943 DeviceMajor:0 DeviceMinor:943 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/335b9f1124f039a1fd483115d3e476453c7ffa85e4ff68ee05e93130ee63f663/userdata/shm DeviceMajor:0 DeviceMinor:965 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~projected/kube-api-access-kqwrr DeviceMajor:0 DeviceMinor:240 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/2369ce94-237f-41ad-9875-173578764483/volumes/kubernetes.io~secret/signing-key DeviceMajor:0 DeviceMinor:415 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/57c6ee9a56cc57dee4a273a8e3079576dd2e072ab358ac7c30617c7193ed9144/userdata/shm DeviceMajor:0 DeviceMinor:788 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/655b9f0a-cf27-443d-b0ea-3642dcae1ad2/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:871 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/stats-auth DeviceMajor:0 DeviceMinor:1010 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:216 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-286 DeviceMajor:0 DeviceMinor:286 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-510 DeviceMajor:0 DeviceMinor:510 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5625eb9f-c80b-47b1-b70c-aa636fbc03ac/volumes/kubernetes.io~projected/kube-api-access-khdpn DeviceMajor:0 DeviceMinor:796 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5446df8b-23d4-4bf3-84ac-d8e1d18813af/volumes/kubernetes.io~projected/kube-api-access-k2gv7 DeviceMajor:0 DeviceMinor:993 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~projected/kube-api-access-c76ff 
DeviceMajor:0 DeviceMinor:227 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~projected/kube-api-access-wvpvs DeviceMajor:0 DeviceMinor:229 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~projected/kube-api-access-zb5zm DeviceMajor:0 DeviceMinor:235 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-756 DeviceMajor:0 DeviceMinor:756 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-832 DeviceMajor:0 DeviceMinor:832 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-913 DeviceMajor:0 DeviceMinor:913 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-901 DeviceMajor:0 DeviceMinor:901 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~projected/kube-api-access-9rkvj DeviceMajor:0 DeviceMinor:254 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/54dbed5af2f7a016c39f2c1ec9963c58ffc5eb61e9822c478a7070f705204697/userdata/shm DeviceMajor:0 DeviceMinor:490 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-774 DeviceMajor:0 DeviceMinor:774 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-319 DeviceMajor:0 DeviceMinor:319 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7f69a884-5fe8-4c03-8258-ff35396efc30/volumes/kubernetes.io~projected/kube-api-access-5n27m DeviceMajor:0 DeviceMinor:773 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-897 DeviceMajor:0 DeviceMinor:897 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-138 
DeviceMajor:0 DeviceMinor:138 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-194 DeviceMajor:0 DeviceMinor:194 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-527 DeviceMajor:0 DeviceMinor:527 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9/volumes/kubernetes.io~projected/kube-api-access-pwj77 DeviceMajor:0 DeviceMinor:780 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5567f1923dad84459fcc9068a666c7d7b21e33dc4f847dbb0c61779518830669/userdata/shm DeviceMajor:0 DeviceMinor:811 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e3fe386a-dea8-484a-b95a-0f3f475b1f82/volumes/kubernetes.io~projected/kube-api-access-fpck7 DeviceMajor:0 DeviceMinor:734 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:43 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:465 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-354 DeviceMajor:0 DeviceMinor:354 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-674 DeviceMajor:0 DeviceMinor:674 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5446df8b-23d4-4bf3-84ac-d8e1d18813af/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:989 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/88596b62ed73d1cc0a657006e38bdd5646ef2e8ca1da1e67945f77115c8e4249/userdata/shm DeviceMajor:0 DeviceMinor:660 Capacity:67108864 Type:vfs Inodes:6166277 
HasInodes:true} {Device:/run/containers/storage/overlay-containers/838330722d2b77da035705dec1c88bcde51f4cbc19b4ce09cd15f9636d1831b9/userdata/shm DeviceMajor:0 DeviceMinor:994 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:25257074688 Type:vfs Inodes:1048576 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9c4dd364a9a5bfd2e74f9430416a21555d78e909a4e0af3ab83914ee450d3acc/userdata/shm DeviceMajor:0 DeviceMinor:119 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/653792dc71e1738c52addebacdd959b3ac0bc6d0fd5e282587420f87400c0319/userdata/shm DeviceMajor:0 DeviceMinor:129 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-189 DeviceMajor:0 DeviceMinor:189 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/420c6d8f-6313-4d6c-b817-420797fc6878/volume-subpaths/run-systemd/ovnkube-controller/6 DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~secret/node-tuning-operator-tls DeviceMajor:0 DeviceMinor:478 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/4e94f64e-4a89-4d9d-acbd-80f86bf2f964/volumes/kubernetes.io~projected/kube-api-access-vmp5q DeviceMajor:0 DeviceMinor:620 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-158 DeviceMajor:0 DeviceMinor:158 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/41f511d18c601df3347c4a0ec791b96cc20cc186c323e16593ccc895c8986828/userdata/shm DeviceMajor:0 DeviceMinor:244 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-273 DeviceMajor:0 DeviceMinor:273 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-296 DeviceMajor:0 
DeviceMinor:296 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f80b9c0c4a67b1a1a1e71289650eb2c4b55996a4c860501c30cc10920d663d48/userdata/shm DeviceMajor:0 DeviceMinor:421 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/970d4806b55e4555ffff42e4b3c89ee95e0a6b585519742e791fd49bb6cf6a08/userdata/shm DeviceMajor:0 DeviceMinor:493 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/59e2a2e1903d65790e4e245427eff740f03e2eb639cf0fb3b61443389c473dd4/userdata/shm DeviceMajor:0 DeviceMinor:670 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-951 DeviceMajor:0 DeviceMinor:951 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-56 DeviceMajor:0 DeviceMinor:56 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:209 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:219 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~secret/catalogserver-certs DeviceMajor:0 DeviceMinor:427 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-450 DeviceMajor:0 DeviceMinor:450 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-549 DeviceMajor:0 DeviceMinor:549 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-967 DeviceMajor:0 DeviceMinor:967 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-76 DeviceMajor:0 DeviceMinor:76 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ab2f6566-730d-46f5-92ed-79e3039d24e8/volumes/kubernetes.io~projected/kube-api-access-vjbmk DeviceMajor:0 DeviceMinor:260 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-703 DeviceMajor:0 DeviceMinor:703 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-953 DeviceMajor:0 DeviceMinor:953 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-998 DeviceMajor:0 DeviceMinor:998 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~projected/kube-api-access-t24zr DeviceMajor:0 DeviceMinor:223 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-282 DeviceMajor:0 DeviceMinor:282 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-508 DeviceMajor:0 DeviceMinor:508 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:518 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-537 DeviceMajor:0 DeviceMinor:537 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5ba8d02efd97ab96c66d2e5a8c58f04777b536ec1ff43d8a222b2f0642623996/userdata/shm DeviceMajor:0 DeviceMinor:886 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-905 DeviceMajor:0 DeviceMinor:905 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-918 DeviceMajor:0 DeviceMinor:918 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff/userdata/shm DeviceMajor:0 DeviceMinor:44 Capacity:67108864 Type:vfs 
Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:482 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~secret/package-server-manager-serving-cert DeviceMajor:0 DeviceMinor:474 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907/volumes/kubernetes.io~projected/kube-api-access-vvzbm DeviceMajor:0 DeviceMinor:679 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-693 DeviceMajor:0 DeviceMinor:693 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-475 DeviceMajor:0 DeviceMinor:475 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0855fa1274661b8c6057731e20d0d7e2922bb3c8e15f7489343279f7ebb261de/userdata/shm DeviceMajor:0 DeviceMinor:268 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-643 DeviceMajor:0 DeviceMinor:643 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b9cb4848c544aa1c865b4801097eee547b05dfc57a09d5c556b7433efc862312/userdata/shm DeviceMajor:0 DeviceMinor:810 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f2f2e007b4a2d99fb4c65eb1e615e291749f7487d8ad5ab87f02a946335ae9ed/userdata/shm DeviceMajor:0 DeviceMinor:819 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-843 DeviceMajor:0 DeviceMinor:843 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-786 DeviceMajor:0 DeviceMinor:786 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582/userdata/shm DeviceMajor:0 DeviceMinor:41 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-150 DeviceMajor:0 DeviceMinor:150 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/7cefc3721be62a4748cdf65d432f7c4f7609bcf801065d7c8f2bec228cbb8187/userdata/shm DeviceMajor:0 DeviceMinor:350 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-352 DeviceMajor:0 DeviceMinor:352 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/511a78a0e5ad980beedfdb42193fee9b75ca4f21b1aa1969b03cf0ced5088a16/userdata/shm DeviceMajor:0 DeviceMinor:154 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1ca4880cca3c21e3d7b1cda1ce4ee79b5948da96d4adeaa90fb0268e490efa53/userdata/shm DeviceMajor:0 DeviceMinor:1016 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/1161350ae27bb5fa66a93e52971a8d01090c473cf7b64c507736b9667f58acfd/userdata/shm DeviceMajor:0 DeviceMinor:1029 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-106 DeviceMajor:0 DeviceMinor:106 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-367 DeviceMajor:0 DeviceMinor:367 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:366 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-401 DeviceMajor:0 DeviceMinor:401 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-971 DeviceMajor:0 DeviceMinor:971 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:1006 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-69 DeviceMajor:0 DeviceMinor:69 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-152 DeviceMajor:0 DeviceMinor:152 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f5c25a913e1399497bfa0861960ba8c967f50ac5018f8a2d5b9a421ec9681e9c/userdata/shm DeviceMajor:0 DeviceMinor:280 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7fa7b789-9201-493e-a96d-484a2622301a/volumes/kubernetes.io~projected/kube-api-access-5nnk5 DeviceMajor:0 DeviceMinor:349 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-504 DeviceMajor:0 DeviceMinor:504 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/volumes/kubernetes.io~secret/control-plane-machine-set-operator-tls DeviceMajor:0 DeviceMinor:760 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-121 DeviceMajor:0 DeviceMinor:121 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/27a66b1a1e6596ddb9cb3d1cc895f4b835b320b66695a73366512a5d14007017/userdata/shm DeviceMajor:0 DeviceMinor:477 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:284 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7f69a884-5fe8-4c03-8258-ff35396efc30/volumes/kubernetes.io~secret/proxy-tls DeviceMajor:0 DeviceMinor:720 
Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1031 DeviceMajor:0 DeviceMinor:1031 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~secret/node-bootstrap-token DeviceMajor:0 DeviceMinor:1039 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-204 DeviceMajor:0 DeviceMinor:204 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-294 DeviceMajor:0 DeviceMinor:294 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-710 DeviceMajor:0 DeviceMinor:710 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6deed9a9-6702-4177-a35d-58ad9930a893/volumes/kubernetes.io~projected/kube-api-access-lzr66 DeviceMajor:0 DeviceMinor:680 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-758 DeviceMajor:0 DeviceMinor:758 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-408 DeviceMajor:0 DeviceMinor:408 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/b269ae2f-44ff-46c7-9039-21fca4a7a790/volumes/kubernetes.io~projected/kube-api-access-hx8ck DeviceMajor:0 DeviceMinor:99 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~projected/kube-api-access-4lbmm DeviceMajor:0 DeviceMinor:524 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-595 DeviceMajor:0 DeviceMinor:595 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-867 DeviceMajor:0 DeviceMinor:867 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-783 DeviceMajor:0 DeviceMinor:783 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-722 DeviceMajor:0 
DeviceMinor:722 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e085120d4a5e0eb8137e18f80d6e36c83dd34577aa53b30526efc6bd45cb44e0/userdata/shm DeviceMajor:0 DeviceMinor:128 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/666475e5-df4b-44ef-a2d4-39d84ab91aad/volumes/kubernetes.io~projected/kube-api-access-w94dz DeviceMajor:0 DeviceMinor:267 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-300 DeviceMajor:0 DeviceMinor:300 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-535 DeviceMajor:0 DeviceMinor:535 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-712 DeviceMajor:0 DeviceMinor:712 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/46d1b044-16fb-4442-a554-6b15a8a1c8ae/volumes/kubernetes.io~projected/kube-api-access-drnv4 DeviceMajor:0 DeviceMinor:777 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-48 DeviceMajor:0 DeviceMinor:48 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:236 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/d8f5f93a07e934393b8425cfc890f9067c53c8f20de05125a9e4859ee33ee65d/userdata/shm DeviceMajor:0 DeviceMinor:248 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-303 DeviceMajor:0 DeviceMinor:303 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-86 DeviceMajor:0 DeviceMinor:86 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/39bc19add4a37ed516d807e6562400e75516e577ed9bf7289f6d2ef65017c8cb/userdata/shm DeviceMajor:0 DeviceMinor:790 
Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/volumes/kubernetes.io~projected/kube-api-access-6f9rq DeviceMajor:0 DeviceMinor:225 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:432 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-539 DeviceMajor:0 DeviceMinor:539 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-161 DeviceMajor:0 DeviceMinor:161 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/kube-api-access-wjtgs DeviceMajor:0 DeviceMinor:249 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a784a82dbf43a1c4004a5cc09c3b8c70da622a6ad91263a79de543269bb69473/userdata/shm DeviceMajor:0 DeviceMinor:334 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-632 DeviceMajor:0 DeviceMinor:632 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~secret/cluster-monitoring-operator-tls DeviceMajor:0 DeviceMinor:466 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a574f1a608b3163ddfe99b7017727c7a66f0c962198037c0d402a194cb014376/userdata/shm DeviceMajor:0 DeviceMinor:812 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-996 DeviceMajor:0 DeviceMinor:996 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-580 DeviceMajor:0 DeviceMinor:580 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-599 DeviceMajor:0 DeviceMinor:599 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5731d94226d26524a88cd0e1f020f55306937afa54c19184462a51a135d32f71/userdata/shm DeviceMajor:0 DeviceMinor:644 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/volumes/kubernetes.io~projected/kube-api-access-rp45l DeviceMajor:0 DeviceMinor:771 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-52 DeviceMajor:0 DeviceMinor:52 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/7f65054f-caf3-4cd3-889e-8d5a5376b1b8/volumes/kubernetes.io~projected/kube-api-access-2b28m DeviceMajor:0 DeviceMinor:798 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:140 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-195 DeviceMajor:0 DeviceMinor:195 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-325 DeviceMajor:0 DeviceMinor:325 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/290f6cf4-daa1-4cae-8e91-2411bf81f8b4/volumes/kubernetes.io~projected/kube-api-access-zjt7j DeviceMajor:0 DeviceMinor:433 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-110 DeviceMajor:0 DeviceMinor:110 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/651fac4a92992d28471be54ad32158ba1aad241805110f8bbba31e3953ad5abe/userdata/shm DeviceMajor:0 DeviceMinor:800 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/655b9f0a-cf27-443d-b0ea-3642dcae1ad2/volumes/kubernetes.io~projected/kube-api-access-7cz8d DeviceMajor:0 DeviceMinor:875 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-131 
DeviceMajor:0 DeviceMinor:131 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8269652e-360f-43ef-9e7d-473c5f478275/volumes/kubernetes.io~secret/cluster-olm-operator-serving-cert DeviceMajor:0 DeviceMinor:214 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b88c5fbe-e19f-45b3-ab03-e1626f95776d/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:215 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/90d94cc33aea936207e84ea412990e0cb76bf40f08a71e787da0678b4b52c9e7/userdata/shm DeviceMajor:0 DeviceMinor:242 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6a3ab252f2a6606dc25ac128723313dc899ecdc469d39e05f29cbcf092da5942/userdata/shm DeviceMajor:0 DeviceMinor:486 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-541 DeviceMajor:0 DeviceMinor:541 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-958 DeviceMajor:0 DeviceMinor:958 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:230 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a9d64cd1-bd5b-4fbc-972b-000a03c854fe/volumes/kubernetes.io~projected/kube-api-access-zbz9p DeviceMajor:0 DeviceMinor:247 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/15270349-f3aa-43bc-88a8-f0fff3aa2528/volumes/kubernetes.io~projected/kube-api-access-qwzgb DeviceMajor:0 DeviceMinor:302 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-444 DeviceMajor:0 DeviceMinor:444 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-92 DeviceMajor:0 DeviceMinor:92 Capacity:214143315968 
Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/29e340b2b6b88ee1d2fe3338e7cc62956472917066207c7d22fcd11ca5797fe0/userdata/shm DeviceMajor:0 DeviceMinor:309 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-425 DeviceMajor:0 DeviceMinor:425 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-545 DeviceMajor:0 DeviceMinor:545 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-846 DeviceMajor:0 DeviceMinor:846 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~secret/webhook-cert DeviceMajor:0 DeviceMinor:876 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc/userdata/shm DeviceMajor:0 DeviceMinor:84 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f8980370-267c-4168-ba97-d780698533ff/volumes/kubernetes.io~projected/kube-api-access-kjhvg DeviceMajor:0 DeviceMinor:94 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a909da7184e68b325f6b02ea8c22a89a391e1bf4dc2d8cf49493f2dee5e4e767/userdata/shm DeviceMajor:0 DeviceMinor:148 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/2a4e91956e6af4d37253ed844488126f5600b96517ef3a0ce7d67e4b637437bf/userdata/shm DeviceMajor:0 DeviceMinor:436 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:484 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-406 DeviceMajor:0 DeviceMinor:406 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~secret/certs DeviceMajor:0 DeviceMinor:1038 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/599c055c-3517-46cb-b584-0050b12a7dea/volumes/kubernetes.io~projected/kube-api-access-6bqlq DeviceMajor:0 DeviceMinor:1040 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1065 DeviceMajor:0 DeviceMinor:1065 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~projected/kube-api-access-6qskh DeviceMajor:0 DeviceMinor:231 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/f8d1302e8231065b5ce889fa97297564a1cfcbc3ee62847cce92e43384f3a740/userdata/shm DeviceMajor:0 DeviceMinor:262 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-423 DeviceMajor:0 DeviceMinor:423 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:485 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-501 DeviceMajor:0 DeviceMinor:501 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/46d1b044-16fb-4442-a554-6b15a8a1c8ae/volumes/kubernetes.io~secret/machine-api-operator-tls DeviceMajor:0 DeviceMinor:729 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~projected/kube-api-access-l2w44 DeviceMajor:0 DeviceMinor:246 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/183a5212-1b21-44e4-9ed5-2f63f76e652e/volumes/kubernetes.io~projected/ca-certs DeviceMajor:0 DeviceMinor:434 Capacity:49335545856 Type:vfs Inodes:6166277 
HasInodes:true} {Device:/var/lib/kubelet/pods/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78/volumes/kubernetes.io~secret/cloud-credential-operator-serving-cert DeviceMajor:0 DeviceMinor:794 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-663 DeviceMajor:0 DeviceMinor:663 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-862 DeviceMajor:0 DeviceMinor:862 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/bf99662680409a7aa806c014bae5b66c40427c61c312090f66a2311a2f39a24c/userdata/shm DeviceMajor:0 DeviceMinor:100 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~projected/kube-api-access-4h4st DeviceMajor:0 DeviceMinor:125 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-134 DeviceMajor:0 DeviceMinor:134 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~projected/kube-api-access-lng9v DeviceMajor:0 DeviceMinor:241 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-454 DeviceMajor:0 DeviceMinor:454 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9f7067a0c3d41100d0e0d6087ce95108117f8991ef8fa09df76a789ed7b78689/userdata/shm DeviceMajor:0 DeviceMinor:626 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/fe67bfc50554c3c039f940d887faf411984b747c8be2377d1eb15383b70de1a2/userdata/shm DeviceMajor:0 DeviceMinor:491 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ca25117a-ccd5-4628-8342-e277bb7be0e2/volumes/kubernetes.io~secret/cloud-controller-manager-operator-tls DeviceMajor:0 DeviceMinor:960 Capacity:49335545856 Type:vfs Inodes:6166277 
HasInodes:true} {Device:overlay_0-1000 DeviceMajor:0 DeviceMinor:1000 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e3fe386a-dea8-484a-b95a-0f3f475b1f82/volumes/kubernetes.io~secret/machine-approver-tls DeviceMajor:0 DeviceMinor:733 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/21e44c55e16841087847adbebb0bb6c58ea019050056419446d9a85cc4d4d496/userdata/shm DeviceMajor:0 DeviceMinor:808 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-658 DeviceMajor:0 DeviceMinor:658 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-853 DeviceMajor:0 DeviceMinor:853 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:25257070592 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:10102829056 Type:vfs Inodes:819200 HasInodes:true} {Device:overlay_0-136 DeviceMajor:0 DeviceMinor:136 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/e946a5469a45f458f7f3463d40633d8f93666f0c7a05ec65f3cba4034066232a/userdata/shm DeviceMajor:0 DeviceMinor:252 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:266 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-514 DeviceMajor:0 DeviceMinor:514 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/ca25117a-ccd5-4628-8342-e277bb7be0e2/volumes/kubernetes.io~projected/kube-api-access-9kgkz DeviceMajor:0 DeviceMinor:964 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-452 DeviceMajor:0 DeviceMinor:452 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:overlay_0-714 DeviceMajor:0 DeviceMinor:714 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/9515e34b-addf-487a-adf8-c6ef24fcc54c/volumes/kubernetes.io~secret/tls-certificates DeviceMajor:0 DeviceMinor:1024 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-719 DeviceMajor:0 DeviceMinor:719 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cd1527a85e67a940e1a665766f8151604f1a3561383f05583571a1de53c19960/userdata/shm DeviceMajor:0 DeviceMinor:255 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b12701eb-4226-4f9c-9398-ad0c3fea7451/volumes/kubernetes.io~projected/kube-api-access-f8mm9 DeviceMajor:0 DeviceMinor:776 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-736 DeviceMajor:0 DeviceMinor:736 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6d5765e6-80cc-404b-b375-c109febd1843/volumes/kubernetes.io~projected/kube-api-access-8wps6 DeviceMajor:0 DeviceMinor:1012 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-903 DeviceMajor:0 DeviceMinor:903 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-707 DeviceMajor:0 DeviceMinor:707 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/29624e4f-d970-4dfa-a8f1-515b73397c8f/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:221 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/449f1ddce65bd4d442100d7cd54f76263e409bc0a5c1725b8cdf399ad8c9c8ba/userdata/shm DeviceMajor:0 DeviceMinor:487 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:523 
Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-618 DeviceMajor:0 DeviceMinor:618 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/fc392945-53ad-473c-8803-70e2026712d2/volumes/kubernetes.io~secret/marketplace-operator-metrics DeviceMajor:0 DeviceMinor:471 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-840 DeviceMajor:0 DeviceMinor:840 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-102 DeviceMajor:0 DeviceMinor:102 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6c378420390e063f3c4cddc0e89f10a0145ae465bce9d9966380956d1429a7da/userdata/shm DeviceMajor:0 DeviceMinor:104 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-313 DeviceMajor:0 DeviceMinor:313 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/2369ce94-237f-41ad-9875-173578764483/volumes/kubernetes.io~projected/kube-api-access-4ds84 DeviceMajor:0 DeviceMinor:420 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:522 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes/kubernetes.io~secret/webhook-certs DeviceMajor:0 DeviceMinor:470 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-112 DeviceMajor:0 DeviceMinor:112 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:226 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/var/lib/kubelet/pods/abfb5602-7255-43d7-a510-e7f94885887e/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:237 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/61a9fce6-50e1-413c-9ec0-177d6e903bdd/volumes/kubernetes.io~projected/kube-api-access-jbggb DeviceMajor:0 DeviceMinor:257 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-369 DeviceMajor:0 DeviceMinor:369 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f8c93e0d-54e5-4c80-9d69-a70317baeacf/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:483 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-708 DeviceMajor:0 DeviceMinor:708 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/27b149f7-6aff-45f3-b935-e65279f2f9ee/volumes/kubernetes.io~projected/kube-api-access-f72ps DeviceMajor:0 DeviceMinor:141 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-298 DeviceMajor:0 DeviceMinor:298 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/69851821-e1fc-44a8-98df-0cfe9d564126/volumes/kubernetes.io~secret/srv-cert DeviceMajor:0 DeviceMinor:472 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9a3242defcab78a5704c3ac516165c6355f42a0842d58543e6938dbfa54c0dc4/userdata/shm DeviceMajor:0 DeviceMinor:748 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021/volumes/kubernetes.io~secret/samples-operator-tls DeviceMajor:0 DeviceMinor:792 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1022 DeviceMajor:0 DeviceMinor:1022 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-168 DeviceMajor:0 DeviceMinor:168 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:overlay_0-665 DeviceMajor:0 DeviceMinor:665 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-163 DeviceMajor:0 DeviceMinor:163 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd9cf577-3c49-417b-a6c0-9d307c113221/volumes/kubernetes.io~projected/kube-api-access-ktjs9 DeviceMajor:0 DeviceMinor:799 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-827 DeviceMajor:0 DeviceMinor:827 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~projected/kube-api-access-tq99k DeviceMajor:0 DeviceMinor:882 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-45 DeviceMajor:0 DeviceMinor:45 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~projected/kube-api-access-dgwj6 DeviceMajor:0 DeviceMinor:123 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/7d462ed3-d191-42a5-b8e0-79ab9af13991/volumes/kubernetes.io~secret/etcd-client DeviceMajor:0 DeviceMinor:521 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78/volumes/kubernetes.io~projected/kube-api-access-d9mmg DeviceMajor:0 DeviceMinor:797 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-976 DeviceMajor:0 DeviceMinor:976 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1020 DeviceMajor:0 DeviceMinor:1020 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/cdde49fab8a3c629c252f1f7390a41b3c48bf77cd72b2434083e80efd11766cc/userdata/shm DeviceMajor:0 DeviceMinor:804 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174/userdata/shm DeviceMajor:0 DeviceMinor:85 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602/volumes/kubernetes.io~projected/kube-api-access-gnnlw DeviceMajor:0 DeviceMinor:251 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~secret/cluster-baremetal-operator-tls DeviceMajor:0 DeviceMinor:480 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~empty-dir/tmp DeviceMajor:0 DeviceMinor:615 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/96cfa9d3-fc26-42e9-8bac-ff2c25223654/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:98 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-705 DeviceMajor:0 DeviceMinor:705 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-761 DeviceMajor:0 DeviceMinor:761 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-698 DeviceMajor:0 DeviceMinor:698 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~projected/bound-sa-token DeviceMajor:0 DeviceMinor:232 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/9e495235becad119aa39d722482114d64ceca8622cb68745ac85876c90e3baab/userdata/shm DeviceMajor:0 DeviceMinor:818 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/b91d2847ef2fd4a9afd46d414fef3fab6d77e51105ef982de75396cd9b632974/userdata/shm DeviceMajor:0 DeviceMinor:822 Capacity:67108864 Type:vfs 
Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/3caff2c1-f178-4e16-916d-27ccf178ff37/volumes/kubernetes.io~projected/kube-api-access-2j2bf DeviceMajor:0 DeviceMinor:115 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-160 DeviceMajor:0 DeviceMinor:160 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-603 DeviceMajor:0 DeviceMinor:603 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-498 DeviceMajor:0 DeviceMinor:498 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-65 DeviceMajor:0 DeviceMinor:65 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-547 DeviceMajor:0 DeviceMinor:547 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~projected/kube-api-access-p2tvr DeviceMajor:0 DeviceMinor:233 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/6fd04d07fa9a3fb32805e8c1045c690ba26c7813c9c939367cc05dfe1bd099ee/userdata/shm DeviceMajor:0 DeviceMinor:668 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/b12701eb-4226-4f9c-9398-ad0c3fea7451/volumes/kubernetes.io~secret/cert DeviceMajor:0 DeviceMinor:731 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-838 DeviceMajor:0 DeviceMinor:838 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~secret/default-certificate DeviceMajor:0 DeviceMinor:1011 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/8512a7f6-889f-483e-960f-1ce3c834e92c/volumes/kubernetes.io~projected/kube-api-access-fqtbf DeviceMajor:0 DeviceMinor:735 Capacity:49335545856
Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1045 DeviceMajor:0 DeviceMinor:1045 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e543d99f-e0dc-49be-95bd-c39eabd05ce8/volumes/kubernetes.io~projected/kube-api-access-dsspm DeviceMajor:0 DeviceMinor:234 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8b09916c2044187ea8d347012e6a895af9b16d05aa54854fd5bef01122aeb601/userdata/shm DeviceMajor:0 DeviceMinor:238 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/f2ca65f5-7dbe-4407-b38e-713592f62136/volumes/kubernetes.io~projected/kube-api-access-fs7nz DeviceMajor:0 DeviceMinor:625 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-502 DeviceMajor:0 DeviceMinor:502 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-96 DeviceMajor:0 DeviceMinor:96 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-412 DeviceMajor:0 DeviceMinor:412 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8ec6f338d22c639a620f442f8c4c1b118ba292e32b27dd86d02fc64df14c1372/userdata/shm DeviceMajor:0 DeviceMinor:272 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-410 DeviceMajor:0 DeviceMinor:410 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-969 DeviceMajor:0 DeviceMinor:969 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-184 DeviceMajor:0 DeviceMinor:184 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-288 DeviceMajor:0 DeviceMinor:288 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/183a5212-1b21-44e4-9ed5-2f63f76e652e/volumes/kubernetes.io~projected/kube-api-access-2jcxp DeviceMajor:0 DeviceMinor:435 Capacity:49335545856 
Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2/volumes/kubernetes.io~secret/image-registry-operator-tls DeviceMajor:0 DeviceMinor:479 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1018 DeviceMajor:0 DeviceMinor:1018 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1043 DeviceMajor:0 DeviceMinor:1043 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/112a83bbfd7da68fd7d98c9912932beebde7c37fe463c6524a512ede7b50dc89/userdata/shm DeviceMajor:0 DeviceMinor:883 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-899 DeviceMajor:0 DeviceMinor:899 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-270 DeviceMajor:0 DeviceMinor:270 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-716 DeviceMajor:0 DeviceMinor:716 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-690 DeviceMajor:0 DeviceMinor:690 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-399 DeviceMajor:0 DeviceMinor:399 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/f08edf29-c53f-452d-880b-e8ce27b05b6f/volumes/kubernetes.io~projected/kube-api-access-hqxlr DeviceMajor:0 DeviceMinor:772 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c7a270720447e0a61bb1c8ec80a8415d28e52795162c44c7229c8de5a130a13d/userdata/shm DeviceMajor:0 DeviceMinor:802 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1033 DeviceMajor:0 DeviceMinor:1033 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/24f69689-ff12-4786-af05-61429e9eadf8/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:217 Capacity:49335545856 
Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-442 DeviceMajor:0 DeviceMinor:442 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~secret/encryption-config DeviceMajor:0 DeviceMinor:371 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-893 DeviceMajor:0 DeviceMinor:893 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-829 DeviceMajor:0 DeviceMinor:829 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/5006ce201ad3cd74c89114726a54a453e54bfc124c5704a26b3fa400b0f6b877/userdata/shm DeviceMajor:0 DeviceMinor:438 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-634 DeviceMajor:0 DeviceMinor:634 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-836 DeviceMajor:0 DeviceMinor:836 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-841 DeviceMajor:0 DeviceMinor:841 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-71 DeviceMajor:0 DeviceMinor:71 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d/volumes/kubernetes.io~secret/apiservice-cert DeviceMajor:0 DeviceMinor:881 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:overlay_0-323 DeviceMajor:0 DeviceMinor:323 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-360 DeviceMajor:0 DeviceMinor:360 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-561 DeviceMajor:0 DeviceMinor:561 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-597 DeviceMajor:0 DeviceMinor:597 
Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/4fbc6f245e73f966c864542a880588442bd18586a3e7854a57473032e1f7135f/userdata/shm DeviceMajor:0 DeviceMinor:628 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-156 DeviceMajor:0 DeviceMinor:156 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/96cfa9d3-fc26-42e9-8bac-ff2c25223654/volumes/kubernetes.io~projected/kube-api-access DeviceMajor:0 DeviceMinor:481 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/d50f92ea-1c78-4535-a14c-96b00f2cf377/volumes/kubernetes.io~projected/kube-api-access-jpjms DeviceMajor:0 DeviceMinor:1013 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/59fb206093956750cd2b0971ba9daf6182e197e8af3331245cd46cb229bb1de1/userdata/shm DeviceMajor:0 DeviceMinor:746 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-824 DeviceMajor:0 DeviceMinor:824 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-179 DeviceMajor:0 DeviceMinor:179 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/5f82d4aa-0cb5-477f-944e-745a21d124fc/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:222 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/47ecf172-666e-4360-97ff-bd9dbccc1fd6/volumes/kubernetes.io~projected/kube-api-access-f748l DeviceMajor:0 DeviceMinor:224 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/aa4738248c68a5f24174bfee8718356f164d810da170930b0082bb8c35862648/userdata/shm DeviceMajor:0 DeviceMinor:258 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} 
{Device:/var/lib/kubelet/pods/a61a736a-66e5-4ca1-a8a7-088cf73cfcce/volumes/kubernetes.io~projected/kube-api-access-rxkw8 DeviceMajor:0 DeviceMinor:265 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/4e94f64e-4a89-4d9d-acbd-80f86bf2f964/volumes/kubernetes.io~secret/metrics-tls DeviceMajor:0 DeviceMinor:621 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-724 DeviceMajor:0 DeviceMinor:724 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61/userdata/shm DeviceMajor:0 DeviceMinor:54 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/bd633b72-3d0b-4601-a2c2-3f487d943b35/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:213 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021/volumes/kubernetes.io~projected/kube-api-access-9mzlv DeviceMajor:0 DeviceMinor:795 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-859 DeviceMajor:0 DeviceMinor:859 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-885 DeviceMajor:0 DeviceMinor:885 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e720291b-0f96-4ebb-80f2-5df7cb194ffc/volumes/kubernetes.io~projected/kube-api-access-65pgv DeviceMajor:0 DeviceMinor:228 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05/volumes/kubernetes.io~empty-dir/etc-tuned DeviceMajor:0 DeviceMinor:616 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25/userdata/shm DeviceMajor:0 DeviceMinor:667 Capacity:67108864 Type:vfs Inodes:6166277 
HasInodes:true} {Device:overlay_0-701 DeviceMajor:0 DeviceMinor:701 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-50 DeviceMajor:0 DeviceMinor:50 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-404 DeviceMajor:0 DeviceMinor:404 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-809 DeviceMajor:0 DeviceMinor:809 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/0d6ce5c921f4c23cf75893970c3672194512ea3e6e2c3df0b77494942ff24a81/userdata/shm DeviceMajor:0 DeviceMinor:1014 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/5b339e6a-cae6-416a-963b-2fd23cecba96/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:218 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/08b2cad01a6764dea466b4d09a0ce4a46e5768814c3b06943f47325ad12f6a84/userdata/shm DeviceMajor:0 DeviceMinor:261 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-292 DeviceMajor:0 DeviceMinor:292 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-566 DeviceMajor:0 DeviceMinor:566 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/dd310b71-6c79-4169-8b8a-7b3fe35a97fd/volumes/kubernetes.io~secret/metrics-certs DeviceMajor:0 DeviceMinor:464 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-750 DeviceMajor:0 DeviceMinor:750 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/a976805a261b43c3cbc596829459288339bb9f57afae203909e8153931024f4e/userdata/shm DeviceMajor:0 DeviceMinor:1041 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-1047 DeviceMajor:0 DeviceMinor:1047 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} 
{Device:/run/containers/storage/overlay-containers/831064ad19912357852f314d15373db7b732cf6bf4313483f541003aef1dbf06/userdata/shm DeviceMajor:0 DeviceMinor:276 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-328 DeviceMajor:0 DeviceMinor:328 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9/volumes/kubernetes.io~projected/kube-api-access-69jxd DeviceMajor:0 DeviceMinor:473 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-570 DeviceMajor:0 DeviceMinor:570 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/34784b03a50881cb9335e39c23d5c919024887bcefedfbd739f247046659eb16/userdata/shm DeviceMajor:0 DeviceMinor:552 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-855 DeviceMajor:0 DeviceMinor:855 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-1060 DeviceMajor:0 DeviceMinor:1060 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-277 DeviceMajor:0 DeviceMinor:277 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-290 DeviceMajor:0 DeviceMinor:290 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-506 DeviceMajor:0 DeviceMinor:506 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-630 DeviceMajor:0 DeviceMinor:630 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c43f11af2c8b842060c66a8968b08b62d92a450aa814f560f58b0b7108694635/userdata/shm DeviceMajor:0 DeviceMinor:494 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-806 DeviceMajor:0 DeviceMinor:806 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-440 DeviceMajor:0 DeviceMinor:440 Capacity:214143315968 Type:vfs 
Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/6deed9a9-6702-4177-a35d-58ad9930a893/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:565 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6/userdata/shm DeviceMajor:0 DeviceMinor:341 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-414 DeviceMajor:0 DeviceMinor:414 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/8512a7f6-889f-483e-960f-1ce3c834e92c/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:730 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-387 DeviceMajor:0 DeviceMinor:387 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/bd9cf577-3c49-417b-a6c0-9d307c113221/volumes/kubernetes.io~secret/cluster-storage-operator-serving-cert DeviceMajor:0 DeviceMinor:791 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/46548c2c-6a8a-4382-87de-2c7a8442a33c/volumes/kubernetes.io~secret/ovn-control-plane-metrics-cert DeviceMajor:0 DeviceMinor:124 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/var/lib/kubelet/pods/3faedef9-d507-48aa-82a8-f3dc9b5adeef/volumes/kubernetes.io~secret/serving-cert DeviceMajor:0 DeviceMinor:220 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-311 DeviceMajor:0 DeviceMinor:311 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:/var/lib/kubelet/pods/e38fc940-e59a-45ff-978b-fdcdc534a2a5/volumes/kubernetes.io~projected/kube-api-access-2zppz DeviceMajor:0 DeviceMinor:326 Capacity:49335545856 Type:vfs Inodes:6166277 HasInodes:true} {Device:/run/containers/storage/overlay-containers/c098327f700751fe6a38c107559ad8b2a80af9c9060aa16b67a2b7a48e44faad/userdata/shm DeviceMajor:0 
DeviceMinor:529 Capacity:67108864 Type:vfs Inodes:6166277 HasInodes:true} {Device:overlay_0-517 DeviceMajor:0 DeviceMinor:517 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-865 DeviceMajor:0 DeviceMinor:865 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true} {Device:overlay_0-869 DeviceMajor:0 DeviceMinor:869 Capacity:214143315968 Type:vfs Inodes:104594880 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none} 252:16:{Name:vdb Major:252 Minor:16 Size:21474836480 Scheduler:none} 252:32:{Name:vdc Major:252 Minor:32 Size:21474836480 Scheduler:none} 252:48:{Name:vdd Major:252 Minor:48 Size:21474836480 Scheduler:none} 252:64:{Name:vde Major:252 Minor:64 Size:21474836480 Scheduler:none}] NetworkDevices:[{Name:0855fa1274661b8 MacAddress:aa:f3:26:91:81:33 Speed:10000 Mtu:1350} {Name:08b2cad01a6764d MacAddress:e2:a6:ee:16:32:b3 Speed:10000 Mtu:1350} {Name:0d6ce5c921f4c23 MacAddress:82:66:fc:df:e6:94 Speed:10000 Mtu:1350} {Name:1161350ae27bb5f MacAddress:d6:16:2d:bf:8c:d4 Speed:10000 Mtu:1350} {Name:27a66b1a1e6596d MacAddress:9e:0f:e5:a6:9f:e1 Speed:10000 Mtu:1350} {Name:29e340b2b6b88ee MacAddress:26:61:6c:2c:15:ff Speed:10000 Mtu:1350} {Name:2a4e91956e6af4d MacAddress:e2:f7:8b:a0:98:72 Speed:10000 Mtu:1350} {Name:34784b03a50881c MacAddress:4e:56:aa:38:f6:0d Speed:10000 Mtu:1350} {Name:39bc19add4a37ed MacAddress:ea:13:27:e6:4e:0f Speed:10000 Mtu:1350} {Name:41f511d18c601df MacAddress:36:aa:cf:78:09:d0 Speed:10000 Mtu:1350} {Name:449f1ddce65bd4d MacAddress:de:f6:6a:45:0e:ea Speed:10000 Mtu:1350} {Name:5006ce201ad3cd7 MacAddress:a6:df:b2:34:fa:84 Speed:10000 Mtu:1350} {Name:511a78a0e5ad980 MacAddress:ae:d7:5a:25:39:28 Speed:10000 Mtu:1350} {Name:54a20e1f511152c MacAddress:06:78:36:a8:2b:94 Speed:10000 Mtu:1350} {Name:54dbed5af2f7a01 MacAddress:4e:9d:f2:d1:4d:17 Speed:10000 Mtu:1350} {Name:5567f1923dad844 MacAddress:46:af:7c:03:9b:b3 Speed:10000 Mtu:1350} {Name:5731d94226d2652 
MacAddress:56:ee:6e:13:d8:a7 Speed:10000 Mtu:1350} {Name:57c6ee9a56cc57d MacAddress:32:96:dc:33:b6:a7 Speed:10000 Mtu:1350} {Name:59e2a2e1903d657 MacAddress:fa:a3:56:6d:3e:93 Speed:10000 Mtu:1350} {Name:59fb20609395675 MacAddress:7e:a5:a7:ce:75:69 Speed:10000 Mtu:1350} {Name:5ba8d02efd97ab9 MacAddress:42:ff:b9:c1:cf:d4 Speed:10000 Mtu:1350} {Name:651fac4a92992d2 MacAddress:a2:0c:25:43:6b:47 Speed:10000 Mtu:1350} {Name:6a3ab252f2a6606 MacAddress:de:09:37:9f:c5:de Speed:10000 Mtu:1350} {Name:6fd04d07fa9a3fb MacAddress:5e:86:d7:54:7b:c8 Speed:10000 Mtu:1350} {Name:7ac62432ddbefa8 MacAddress:6e:c7:95:2e:35:69 Speed:10000 Mtu:1350} {Name:7cefc3721be62a4 MacAddress:62:fa:06:ee:91:18 Speed:10000 Mtu:1350} {Name:831064ad1991235 MacAddress:26:4c:e2:ea:57:99 Speed:10000 Mtu:1350} {Name:838330722d2b77d MacAddress:da:30:a9:56:b0:d1 Speed:10000 Mtu:1350} {Name:88596b62ed73d1c MacAddress:b6:82:91:c4:51:38 Speed:10000 Mtu:1350} {Name:8b09916c2044187 MacAddress:92:93:f6:b3:67:43 Speed:10000 Mtu:1350} {Name:8ec6f338d22c639 MacAddress:b6:5e:82:b8:70:be Speed:10000 Mtu:1350} {Name:90d94cc33aea936 MacAddress:aa:18:2c:a8:b1:67 Speed:10000 Mtu:1350} {Name:970d4806b55e455 MacAddress:4e:79:79:1a:5f:0b Speed:10000 Mtu:1350} {Name:9a3242defcab78a MacAddress:3e:0f:59:f7:2f:f9 Speed:10000 Mtu:1350} {Name:9e495235becad11 MacAddress:5a:e8:d1:df:14:aa Speed:10000 Mtu:1350} {Name:9f7067a0c3d4110 MacAddress:22:a6:86:97:56:f4 Speed:10000 Mtu:1350} {Name:a574f1a608b3163 MacAddress:ae:75:d2:ba:1c:1f Speed:10000 Mtu:1350} {Name:a784a82dbf43a1c MacAddress:b2:5a:f6:e1:ef:42 Speed:10000 Mtu:1350} {Name:aa4738248c68a5f MacAddress:42:b6:8a:0d:ee:bf Speed:10000 Mtu:1350} {Name:b91d2847ef2fd4a MacAddress:be:19:01:be:e6:4c Speed:10000 Mtu:1350} {Name:b9cb4848c544aa1 MacAddress:92:f8:78:b6:71:dd Speed:10000 Mtu:1350} {Name:br-ex MacAddress:fa:16:9e:81:f6:10 Speed:0 Mtu:1450} {Name:br-int MacAddress:aa:bb:22:b4:42:aa Speed:0 Mtu:1350} {Name:c098327f700751f MacAddress:f6:60:f6:5d:84:b2 Speed:10000 Mtu:1350} 
{Name:c7a270720447e0a MacAddress:c2:c6:81:c1:82:8f Speed:10000 Mtu:1350} {Name:cd1527a85e67a94 MacAddress:1e:12:06:a3:b7:cb Speed:10000 Mtu:1350} {Name:cdde49fab8a3c62 MacAddress:16:2e:6e:c2:1b:43 Speed:10000 Mtu:1350} {Name:d8f5f93a07e9343 MacAddress:52:d1:a0:fc:aa:8d Speed:10000 Mtu:1350} {Name:e946a5469a45f45 MacAddress:0e:4d:d7:9c:44:d1 Speed:10000 Mtu:1350} {Name:eth0 MacAddress:fa:16:9e:81:f6:10 Speed:-1 Mtu:1450} {Name:eth1 MacAddress:fa:16:3e:6e:e9:7d Speed:-1 Mtu:1450} {Name:eth2 MacAddress:fa:16:3e:38:b1:02 Speed:-1 Mtu:1450} {Name:f2f2e007b4a2d99 MacAddress:ea:59:2a:05:ab:f1 Speed:10000 Mtu:1350} {Name:f80b9c0c4a67b1a MacAddress:96:85:55:ef:01:f4 Speed:10000 Mtu:1350} {Name:f8d1302e8231065 MacAddress:22:6a:0f:2d:3e:20 Speed:10000 Mtu:1350} {Name:fe67bfc50554c3c MacAddress:56:a7:9d:bd:27:e7 Speed:10000 Mtu:1350} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:80:00:02 Speed:0 Mtu:1350} {Name:ovs-system MacAddress:e6:01:79:57:35:2d Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:50514145280 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 
BookID: DrawerID:} {Id:0 Threads:[12] Caches:[{Id:12 Size:32768 Type:Data Level:1} {Id:12 Size:32768 Type:Instruction Level:1} {Id:12 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:12 Size:16777216 Type:Unified Level:3}] SocketID:12 BookID: DrawerID:} {Id:0 Threads:[13] Caches:[{Id:13 Size:32768 Type:Data Level:1} {Id:13 Size:32768 Type:Instruction Level:1} {Id:13 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:13 Size:16777216 Type:Unified Level:3}] SocketID:13 BookID: DrawerID:} {Id:0 Threads:[14] Caches:[{Id:14 Size:32768 Type:Data Level:1} {Id:14 Size:32768 Type:Instruction Level:1} {Id:14 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:14 Size:16777216 Type:Unified Level:3}] SocketID:14 BookID: DrawerID:} {Id:0 Threads:[15] Caches:[{Id:15 Size:32768 Type:Data Level:1} {Id:15 Size:32768 Type:Instruction Level:1} {Id:15 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:15 Size:16777216 Type:Unified Level:3}] SocketID:15 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 
Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 07 21:17:57.119103 master-0 kubenswrapper[16352]: I0307 21:17:57.118132 16352 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Mar 07 21:17:57.119103 master-0 kubenswrapper[16352]: I0307 21:17:57.118370 16352 manager.go:233] Version: {KernelVersion:5.14.0-427.111.1.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202602172219-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 07 21:17:57.119103 master-0 kubenswrapper[16352]: I0307 21:17:57.118914 16352 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 07 21:17:57.119973 master-0 kubenswrapper[16352]: I0307 21:17:57.119883 16352 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 07 21:17:57.120325 master-0 kubenswrapper[16352]: I0307 21:17:57.119968 16352 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"master-0","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"P
ercentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 07 21:17:57.120396 master-0 kubenswrapper[16352]: I0307 21:17:57.120362 16352 topology_manager.go:138] "Creating topology manager with none policy" Mar 07 21:17:57.120396 master-0 kubenswrapper[16352]: I0307 21:17:57.120379 16352 container_manager_linux.go:303] "Creating device plugin manager" Mar 07 21:17:57.120470 master-0 kubenswrapper[16352]: I0307 21:17:57.120393 16352 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 21:17:57.120470 master-0 kubenswrapper[16352]: I0307 21:17:57.120432 16352 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 07 21:17:57.120538 master-0 kubenswrapper[16352]: I0307 21:17:57.120519 16352 state_mem.go:36] "Initialized new in-memory state store" Mar 07 21:17:57.120691 master-0 kubenswrapper[16352]: I0307 21:17:57.120654 16352 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 07 21:17:57.120791 master-0 kubenswrapper[16352]: I0307 21:17:57.120768 16352 kubelet.go:418] "Attempting to sync node with API server" Mar 07 21:17:57.120846 master-0 kubenswrapper[16352]: I0307 21:17:57.120795 16352 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 07 21:17:57.120846 master-0 kubenswrapper[16352]: I0307 21:17:57.120819 16352 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 07 21:17:57.120846 master-0 kubenswrapper[16352]: I0307 21:17:57.120843 16352 kubelet.go:324] "Adding apiserver pod source" Mar 
07 21:17:57.120957 master-0 kubenswrapper[16352]: I0307 21:17:57.120863 16352 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 07 21:17:57.123276 master-0 kubenswrapper[16352]: I0307 21:17:57.123216 16352 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.13-8.rhaos4.18.gitd78977c.el9" apiVersion="v1" Mar 07 21:17:57.126936 master-0 kubenswrapper[16352]: I0307 21:17:57.126902 16352 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". Mar 07 21:17:57.127349 master-0 kubenswrapper[16352]: I0307 21:17:57.127319 16352 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127486 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127515 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127522 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127529 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127537 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127545 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127552 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 07 21:17:57.127548 master-0 kubenswrapper[16352]: I0307 21:17:57.127559 16352 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/downward-api" Mar 07 21:17:57.127882 master-0 kubenswrapper[16352]: I0307 21:17:57.127570 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 07 21:17:57.127882 master-0 kubenswrapper[16352]: I0307 21:17:57.127578 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 07 21:17:57.127882 master-0 kubenswrapper[16352]: I0307 21:17:57.127592 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 07 21:17:57.127882 master-0 kubenswrapper[16352]: I0307 21:17:57.127606 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 07 21:17:57.127882 master-0 kubenswrapper[16352]: I0307 21:17:57.127644 16352 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 07 21:17:57.128249 master-0 kubenswrapper[16352]: I0307 21:17:57.128214 16352 server.go:1280] "Started kubelet" Mar 07 21:17:57.128408 master-0 kubenswrapper[16352]: I0307 21:17:57.128356 16352 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 07 21:17:57.129039 master-0 kubenswrapper[16352]: I0307 21:17:57.128985 16352 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 07 21:17:57.129284 master-0 kubenswrapper[16352]: I0307 21:17:57.129012 16352 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 07 21:17:57.129284 master-0 kubenswrapper[16352]: I0307 21:17:57.129197 16352 server_v1.go:47] "podresources" method="list" useActivePods=true Mar 07 21:17:57.130066 master-0 kubenswrapper[16352]: I0307 21:17:57.130037 16352 server.go:449] "Adding debug handlers to kubelet server" Mar 07 21:17:57.130802 master-0 systemd[1]: Started Kubernetes Kubelet. 
Mar 07 21:17:57.131016 master-0 kubenswrapper[16352]: I0307 21:17:57.130941 16352 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 07 21:17:57.144082 master-0 kubenswrapper[16352]: I0307 21:17:57.144017 16352 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 21:17:57.148792 master-0 kubenswrapper[16352]: I0307 21:17:57.148729 16352 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 07 21:17:57.148792 master-0 kubenswrapper[16352]: I0307 21:17:57.148793 16352 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 07 21:17:57.149115 master-0 kubenswrapper[16352]: I0307 21:17:57.148969 16352 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 07 21:17:57.149115 master-0 kubenswrapper[16352]: I0307 21:17:57.148998 16352 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 07 21:17:57.149115 master-0 kubenswrapper[16352]: I0307 21:17:57.148943 16352 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-03-08 21:04:42 +0000 UTC, rotation deadline is 2026-03-08 15:31:34.607665316 +0000 UTC Mar 07 21:17:57.149115 master-0 kubenswrapper[16352]: I0307 21:17:57.149065 16352 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 18h13m37.458603178s for next certificate rotation Mar 07 21:17:57.149115 master-0 kubenswrapper[16352]: I0307 21:17:57.149017 16352 desired_state_of_world_populator.go:147] "Desired state populator starts to run" Mar 07 21:17:57.151957 master-0 kubenswrapper[16352]: E0307 21:17:57.151764 16352 kubelet.go:1495] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Mar 07 21:17:57.152440 master-0 kubenswrapper[16352]: I0307 21:17:57.152412 16352 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 21:17:57.157089 master-0 kubenswrapper[16352]: I0307 21:17:57.156967 16352 factory.go:55] Registering systemd factory Mar 07 21:17:57.157159 master-0 kubenswrapper[16352]: I0307 21:17:57.157107 16352 factory.go:221] Registration of the systemd container factory successfully Mar 07 21:17:57.157550 master-0 kubenswrapper[16352]: I0307 21:17:57.157515 16352 factory.go:153] Registering CRI-O factory Mar 07 21:17:57.157602 master-0 kubenswrapper[16352]: I0307 21:17:57.157560 16352 factory.go:221] Registration of the crio container factory successfully Mar 07 21:17:57.157743 master-0 kubenswrapper[16352]: I0307 21:17:57.157717 16352 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 07 21:17:57.157797 master-0 kubenswrapper[16352]: I0307 21:17:57.157769 16352 factory.go:103] Registering Raw factory Mar 07 21:17:57.157797 master-0 kubenswrapper[16352]: I0307 21:17:57.157792 16352 manager.go:1196] Started watching for new ooms in manager Mar 07 21:17:57.158713 master-0 kubenswrapper[16352]: I0307 21:17:57.158658 16352 manager.go:319] Starting recovery of all containers Mar 07 21:17:57.162531 master-0 kubenswrapper[16352]: I0307 21:17:57.162396 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b269ae2f-44ff-46c7-9039-21fca4a7a790" volumeName="kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck" seLinuxMountContext="" Mar 07 21:17:57.162633 master-0 kubenswrapper[16352]: I0307 21:17:57.162546 16352 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d" volumeName="kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert" seLinuxMountContext=""
Mar 07 21:17:57.162633 master-0 kubenswrapper[16352]: I0307 21:17:57.162570 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-client" seLinuxMountContext=""
Mar 07 21:17:57.162633 master-0 kubenswrapper[16352]: I0307 21:17:57.162589 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" volumeName="kubernetes.io/projected/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-kube-api-access-7cz8d" seLinuxMountContext=""
Mar 07 21:17:57.162633 master-0 kubenswrapper[16352]: I0307 21:17:57.162616 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="982319eb-2dc2-4faa-85d8-ee11840179fd" volumeName="kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162645 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" volumeName="kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162671 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e3fe386a-dea8-484a-b95a-0f3f475b1f82" volumeName="kubernetes.io/projected/e3fe386a-dea8-484a-b95a-0f3f475b1f82-kube-api-access-fpck7" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162709 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d64cd1-bd5b-4fbc-972b-000a03c854fe" volumeName="kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162739 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd633b72-3d0b-4601-a2c2-3f487d943b35" volumeName="kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162756 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9" volumeName="kubernetes.io/projected/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-kube-api-access-pwj77" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162771 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ca25117a-ccd5-4628-8342-e277bb7be0e2" volumeName="kubernetes.io/projected/ca25117a-ccd5-4628-8342-e277bb7be0e2-kube-api-access-9kgkz" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162797 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b269ae2f-44ff-46c7-9039-21fca4a7a790" volumeName="kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy" seLinuxMountContext=""
Mar 07 21:17:57.162850 master-0 kubenswrapper[16352]: I0307 21:17:57.162838 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd9cf577-3c49-417b-a6c0-9d307c113221" volumeName="kubernetes.io/projected/bd9cf577-3c49-417b-a6c0-9d307c113221-kube-api-access-ktjs9" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.162870 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f69689-ff12-4786-af05-61429e9eadf8" volumeName="kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.162889 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47ecf172-666e-4360-97ff-bd9dbccc1fd6" volumeName="kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.162913 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.162930 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666475e5-df4b-44ef-a2d4-39d84ab91aad" volumeName="kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.162957 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8269652e-360f-43ef-9e7d-473c5f478275" volumeName="kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.162974 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e3fe386a-dea8-484a-b95a-0f3f475b1f82" volumeName="kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.162988 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2369ce94-237f-41ad-9875-173578764483" volumeName="kubernetes.io/projected/2369ce94-237f-41ad-9875-173578764483-kube-api-access-4ds84" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163014 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6deed9a9-6702-4177-a35d-58ad9930a893" volumeName="kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163035 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f65054f-caf3-4cd3-889e-8d5a5376b1b8" volumeName="kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-utilities" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163061 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7fa7b789-9201-493e-a96d-484a2622301a" volumeName="kubernetes.io/projected/7fa7b789-9201-493e-a96d-484a2622301a-kube-api-access-5nnk5" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163080 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d50f92ea-1c78-4535-a14c-96b00f2cf377" volumeName="kubernetes.io/projected/d50f92ea-1c78-4535-a14c-96b00f2cf377-kube-api-access-jpjms" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163098 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8980370-267c-4168-ba97-d780698533ff" volumeName="kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163120 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc392945-53ad-473c-8803-70e2026712d2" volumeName="kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163149 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163183 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163213 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e94f64e-4a89-4d9d-acbd-80f86bf2f964" volumeName="kubernetes.io/configmap/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-config-volume" seLinuxMountContext=""
Mar 07 21:17:57.163215 master-0 kubenswrapper[16352]: I0307 21:17:57.163231 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-trusted-ca-bundle" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163249 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08edf29-c53f-452d-880b-e8ce27b05b6f" volumeName="kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-utilities" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163272 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b88c5fbe-e19f-45b3-ab03-e1626f95776d" volumeName="kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163291 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ca25117a-ccd5-4628-8342-e277bb7be0e2" volumeName="kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163319 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e543d99f-e0dc-49be-95bd-c39eabd05ce8" volumeName="kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163342 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c93e0d-54e5-4c80-9d69-a70317baeacf" volumeName="kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163361 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163388 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="290f6cf4-daa1-4cae-8e91-2411bf81f8b4" volumeName="kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-kube-api-access-zjt7j" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163412 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d462ed3-d191-42a5-b8e0-79ab9af13991" volumeName="kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-encryption-config" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163436 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96cfa9d3-fc26-42e9-8bac-ff2c25223654" volumeName="kubernetes.io/projected/96cfa9d3-fc26-42e9-8bac-ff2c25223654-kube-api-access" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163451 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="599c055c-3517-46cb-b584-0050b12a7dea" volumeName="kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163469 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d462ed3-d191-42a5-b8e0-79ab9af13991" volumeName="kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-policies" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163491 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08edf29-c53f-452d-880b-e8ce27b05b6f" volumeName="kubernetes.io/projected/f08edf29-c53f-452d-880b-e8ce27b05b6f-kube-api-access-hqxlr" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163509 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" volumeName="kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-catalog-content" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163525 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6deed9a9-6702-4177-a35d-58ad9930a893" volumeName="kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163546 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d462ed3-d191-42a5-b8e0-79ab9af13991" volumeName="kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163573 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96cfa9d3-fc26-42e9-8bac-ff2c25223654" volumeName="kubernetes.io/secret/96cfa9d3-fc26-42e9-8bac-ff2c25223654-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163602 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47ecf172-666e-4360-97ff-bd9dbccc1fd6" volumeName="kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163623 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61a9fce6-50e1-413c-9ec0-177d6e903bdd" volumeName="kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163643 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d50f92ea-1c78-4535-a14c-96b00f2cf377" volumeName="kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-stats-auth" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163672 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c93e0d-54e5-4c80-9d69-a70317baeacf" volumeName="kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163828 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c" volumeName="kubernetes.io/projected/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-kube-api-access-rp45l" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163857 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="abfb5602-7255-43d7-a510-e7f94885887e" volumeName="kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163882 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc392945-53ad-473c-8803-70e2026712d2" volumeName="kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163908 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="183a5212-1b21-44e4-9ed5-2f63f76e652e" volumeName="kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-kube-api-access-2jcxp" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163937 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b339e6a-cae6-416a-963b-2fd23cecba96" volumeName="kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config" seLinuxMountContext=""
Mar 07 21:17:57.163979 master-0 kubenswrapper[16352]: I0307 21:17:57.163957 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f65054f-caf3-4cd3-889e-8d5a5376b1b8" volumeName="kubernetes.io/projected/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-kube-api-access-2b28m" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164035 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd633b72-3d0b-4601-a2c2-3f487d943b35" volumeName="kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164063 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" volumeName="kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164097 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3faedef9-d507-48aa-82a8-f3dc9b5adeef" volumeName="kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164122 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="666475e5-df4b-44ef-a2d4-39d84ab91aad" volumeName="kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164157 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" volumeName="kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164199 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8980370-267c-4168-ba97-d780698533ff" volumeName="kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164227 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46d1b044-16fb-4442-a554-6b15a8a1c8ae" volumeName="kubernetes.io/projected/46d1b044-16fb-4442-a554-6b15a8a1c8ae-kube-api-access-drnv4" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164263 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b339e6a-cae6-416a-963b-2fd23cecba96" volumeName="kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164282 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b269ae2f-44ff-46c7-9039-21fca4a7a790" volumeName="kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164321 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-image-import-ca" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164347 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29624e4f-d970-4dfa-a8f1-515b73397c8f" volumeName="kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.164857 master-0 kubenswrapper[16352]: I0307 21:17:57.164372 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" volumeName="kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-utilities" seLinuxMountContext=""
Mar 07 21:17:57.165196 master-0 kubenswrapper[16352]: I0307 21:17:57.164915 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="982319eb-2dc2-4faa-85d8-ee11840179fd" volumeName="kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs" seLinuxMountContext=""
Mar 07 21:17:57.165196 master-0 kubenswrapper[16352]: I0307 21:17:57.165085 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit" seLinuxMountContext=""
Mar 07 21:17:57.165196 master-0 kubenswrapper[16352]: I0307 21:17:57.165129 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c93e0d-54e5-4c80-9d69-a70317baeacf" volumeName="kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca" seLinuxMountContext=""
Mar 07 21:17:57.165196 master-0 kubenswrapper[16352]: I0307 21:17:57.165156 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3faedef9-d507-48aa-82a8-f3dc9b5adeef" volumeName="kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access" seLinuxMountContext=""
Mar 07 21:17:57.165305 master-0 kubenswrapper[16352]: I0307 21:17:57.165186 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8512a7f6-889f-483e-960f-1ce3c834e92c" volumeName="kubernetes.io/empty-dir/8512a7f6-889f-483e-960f-1ce3c834e92c-snapshots" seLinuxMountContext=""
Mar 07 21:17:57.165433 master-0 kubenswrapper[16352]: I0307 21:17:57.165223 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8512a7f6-889f-483e-960f-1ce3c834e92c" volumeName="kubernetes.io/projected/8512a7f6-889f-483e-960f-1ce3c834e92c-kube-api-access-fqtbf" seLinuxMountContext=""
Mar 07 21:17:57.165659 master-0 kubenswrapper[16352]: I0307 21:17:57.165585 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e543d99f-e0dc-49be-95bd-c39eabd05ce8" volumeName="kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.165732 master-0 kubenswrapper[16352]: I0307 21:17:57.165659 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert" seLinuxMountContext=""
Mar 07 21:17:57.165732 master-0 kubenswrapper[16352]: I0307 21:17:57.165724 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d462ed3-d191-42a5-b8e0-79ab9af13991" volumeName="kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-serving-ca" seLinuxMountContext=""
Mar 07 21:17:57.165793 master-0 kubenswrapper[16352]: I0307 21:17:57.165748 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85bb04ed-e2d1-496d-8f2c-9555bb3c5d78" volumeName="kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca" seLinuxMountContext=""
Mar 07 21:17:57.165793 master-0 kubenswrapper[16352]: I0307 21:17:57.165773 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96cfa9d3-fc26-42e9-8bac-ff2c25223654" volumeName="kubernetes.io/configmap/96cfa9d3-fc26-42e9-8bac-ff2c25223654-service-ca" seLinuxMountContext=""
Mar 07 21:17:57.165848 master-0 kubenswrapper[16352]: I0307 21:17:57.165791 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/projected/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-kube-api-access-69jxd" seLinuxMountContext=""
Mar 07 21:17:57.165848 master-0 kubenswrapper[16352]: I0307 21:17:57.165837 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps" seLinuxMountContext=""
Mar 07 21:17:57.165956 master-0 kubenswrapper[16352]: I0307 21:17:57.165861 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5446df8b-23d4-4bf3-84ac-d8e1d18813af" volumeName="kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config" seLinuxMountContext=""
Mar 07 21:17:57.166002 master-0 kubenswrapper[16352]: I0307 21:17:57.165958 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8269652e-360f-43ef-9e7d-473c5f478275" volumeName="kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets" seLinuxMountContext=""
Mar 07 21:17:57.166002 master-0 kubenswrapper[16352]: I0307 21:17:57.165983 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="290f6cf4-daa1-4cae-8e91-2411bf81f8b4" volumeName="kubernetes.io/secret/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-catalogserver-certs" seLinuxMountContext=""
Mar 07 21:17:57.166058 master-0 kubenswrapper[16352]: I0307 21:17:57.166001 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29624e4f-d970-4dfa-a8f1-515b73397c8f" volumeName="kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44" seLinuxMountContext=""
Mar 07 21:17:57.166058 master-0 kubenswrapper[16352]: I0307 21:17:57.166020 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf" seLinuxMountContext=""
Mar 07 21:17:57.166119 master-0 kubenswrapper[16352]: I0307 21:17:57.166058 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" volumeName="kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca" seLinuxMountContext=""
Mar 07 21:17:57.166151 master-0 kubenswrapper[16352]: I0307 21:17:57.166118 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e720291b-0f96-4ebb-80f2-5df7cb194ffc" volumeName="kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv" seLinuxMountContext=""
Mar 07 21:17:57.166209 master-0 kubenswrapper[16352]: I0307 21:17:57.166184 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-serving-ca" seLinuxMountContext=""
Mar 07 21:17:57.166271 master-0 kubenswrapper[16352]: I0307 21:17:57.166227 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert" seLinuxMountContext=""
Mar 07 21:17:57.166315 master-0 kubenswrapper[16352]: I0307 21:17:57.166278 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6deed9a9-6702-4177-a35d-58ad9930a893" volumeName="kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config" seLinuxMountContext=""
Mar 07 21:17:57.166424 master-0 kubenswrapper[16352]: I0307 21:17:57.166377 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" volumeName="kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config" seLinuxMountContext=""
Mar 07 21:17:57.166468 master-0 kubenswrapper[16352]: I0307 21:17:57.166424 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d" volumeName="kubernetes.io/empty-dir/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-tmpfs" seLinuxMountContext=""
Mar 07 21:17:57.166468 master-0 kubenswrapper[16352]: I0307 21:17:57.166445 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d50f92ea-1c78-4535-a14c-96b00f2cf377" volumeName="kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-default-certificate" seLinuxMountContext=""
Mar 07 21:17:57.166468 master-0 kubenswrapper[16352]: I0307 21:17:57.166461 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46d1b044-16fb-4442-a554-6b15a8a1c8ae" volumeName="kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config" seLinuxMountContext=""
Mar 07 21:17:57.166554 master-0 kubenswrapper[16352]: I0307 21:17:57.166478 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" volumeName="kubernetes.io/projected/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-kube-api-access-khdpn" seLinuxMountContext=""
Mar 07 21:17:57.166554 master-0 kubenswrapper[16352]: I0307 21:17:57.166505 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b12701eb-4226-4f9c-9398-ad0c3fea7451" volumeName="kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert" seLinuxMountContext=""
Mar 07 21:17:57.166554 master-0 kubenswrapper[16352]: I0307 21:17:57.166522 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d64cd1-bd5b-4fbc-972b-000a03c854fe" volumeName="kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p" seLinuxMountContext=""
Mar 07 21:17:57.166647 master-0 kubenswrapper[16352]: I0307 21:17:57.166564 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides" seLinuxMountContext=""
Mar 07 21:17:57.166647 master-0 kubenswrapper[16352]: I0307 21:17:57.166604 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5446df8b-23d4-4bf3-84ac-d8e1d18813af" volumeName="kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls" seLinuxMountContext=""
Mar 07 21:17:57.166755 master-0 kubenswrapper[16352]: I0307 21:17:57.166727 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" volumeName="kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls" seLinuxMountContext=""
Mar 07 21:17:57.166930 master-0 kubenswrapper[16352]: I0307 21:17:57.166779 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8512a7f6-889f-483e-960f-1ce3c834e92c" volumeName="kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167239 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbc6fdd7-cbf1-416d-a986-bbd6ba259c05" volumeName="kubernetes.io/projected/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-kube-api-access-87fml" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167284 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d50f92ea-1c78-4535-a14c-96b00f2cf377" volumeName="kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-metrics-certs" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167396 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f69689-ff12-4786-af05-61429e9eadf8" volumeName="kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167445 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167491 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9" volumeName="kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-catalog-content" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167540 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167565 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f69a884-5fe8-4c03-8258-ff35396efc30" volumeName="kubernetes.io/projected/7f69a884-5fe8-4c03-8258-ff35396efc30-kube-api-access-5n27m" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167591 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fc392945-53ad-473c-8803-70e2026712d2" volumeName="kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167633 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167657 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e720291b-0f96-4ebb-80f2-5df7cb194ffc" volumeName="kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167715 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167743 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167796 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167872 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b88c5fbe-e19f-45b3-ab03-e1626f95776d" volumeName="kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167894 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f08edf29-c53f-452d-880b-e8ce27b05b6f" volumeName="kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-catalog-content" seLinuxMountContext=""
Mar 07 21:17:57.167934 master-0 kubenswrapper[16352]: I0307 21:17:57.167924 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47ecf172-666e-4360-97ff-bd9dbccc1fd6" volumeName="kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.167992 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="599c055c-3517-46cb-b584-0050b12a7dea" volumeName="kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168116 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8512a7f6-889f-483e-960f-1ce3c834e92c" volumeName="kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168144 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e3fe386a-dea8-484a-b95a-0f3f475b1f82" volumeName="kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168161 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168211 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168231 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" volumeName="kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168247 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" volumeName="kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168261 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert" seLinuxMountContext=""
Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168293 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod=""
podName="ca25117a-ccd5-4628-8342-e277bb7be0e2" volumeName="kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168310 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-config" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168428 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d" volumeName="kubernetes.io/projected/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-kube-api-access-tq99k" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168458 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46548c2c-6a8a-4382-87de-2c7a8442a33c" volumeName="kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168491 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69851821-e1fc-44a8-98df-0cfe9d564126" volumeName="kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168516 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a9d64cd1-bd5b-4fbc-972b-000a03c854fe" volumeName="kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168575 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" volumeName="kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168601 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" volumeName="kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-encryption-config" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168618 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f8c93e0d-54e5-4c80-9d69-a70317baeacf" volumeName="kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168642 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168657 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9515e34b-addf-487a-adf8-c6ef24fcc54c" volumeName="kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168675 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="abfb5602-7255-43d7-a510-e7f94885887e" volumeName="kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168733 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="bbc6fdd7-cbf1-416d-a986-bbd6ba259c05" volumeName="kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-tmp" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168753 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd633b72-3d0b-4601-a2c2-3f487d943b35" volumeName="kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168775 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5446df8b-23d4-4bf3-84ac-d8e1d18813af" volumeName="kubernetes.io/projected/5446df8b-23d4-4bf3-84ac-d8e1d18813af-kube-api-access-k2gv7" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168791 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="69851821-e1fc-44a8-98df-0cfe9d564126" volumeName="kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168807 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9" volumeName="kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-utilities" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168828 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="47ecf172-666e-4360-97ff-bd9dbccc1fd6" volumeName="kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168843 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b339e6a-cae6-416a-963b-2fd23cecba96" volumeName="kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168859 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d462ed3-d191-42a5-b8e0-79ab9af13991" volumeName="kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-client" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168931 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8269652e-360f-43ef-9e7d-473c5f478275" volumeName="kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168973 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e543d99f-e0dc-49be-95bd-c39eabd05ce8" volumeName="kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.168993 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.169019 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c" volumeName="kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.169048 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="7d462ed3-d191-42a5-b8e0-79ab9af13991" volumeName="kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-trusted-ca-bundle" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.169163 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ca25117a-ccd5-4628-8342-e277bb7be0e2" volumeName="kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.169195 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="29624e4f-d970-4dfa-a8f1-515b73397c8f" volumeName="kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.169215 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e38fc940-e59a-45ff-978b-fdcdc534a2a5" volumeName="kubernetes.io/projected/e38fc940-e59a-45ff-978b-fdcdc534a2a5-kube-api-access-2zppz" seLinuxMountContext="" Mar 07 21:17:57.169253 master-0 kubenswrapper[16352]: I0307 21:17:57.169230 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169271 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="85bb04ed-e2d1-496d-8f2c-9555bb3c5d78" volumeName="kubernetes.io/projected/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-kube-api-access-d9mmg" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169404 16352 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="e3fe386a-dea8-484a-b95a-0f3f475b1f82" volumeName="kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169441 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7d462ed3-d191-42a5-b8e0-79ab9af13991" volumeName="kubernetes.io/projected/7d462ed3-d191-42a5-b8e0-79ab9af13991-kube-api-access-4lbmm" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169477 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b88c5fbe-e19f-45b3-ab03-e1626f95776d" volumeName="kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169501 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021" volumeName="kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169589 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" volumeName="kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169615 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f69a884-5fe8-4c03-8258-ff35396efc30" volumeName="kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169652 16352 reconstruct.go:130] "Volume is 
marked as uncertain and added into the actual state" pod="" podName="85bb04ed-e2d1-496d-8f2c-9555bb3c5d78" volumeName="kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169740 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ab2f6566-730d-46f5-92ed-79e3039d24e8" volumeName="kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169796 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f2ca65f5-7dbe-4407-b38e-713592f62136" volumeName="kubernetes.io/projected/f2ca65f5-7dbe-4407-b38e-713592f62136-kube-api-access-fs7nz" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169823 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169847 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="61a9fce6-50e1-413c-9ec0-177d6e903bdd" volumeName="kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169893 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6deed9a9-6702-4177-a35d-58ad9930a893" volumeName="kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169959 16352 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd9cf577-3c49-417b-a6c0-9d307c113221" volumeName="kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.169981 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" volumeName="kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170046 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="183a5212-1b21-44e4-9ed5-2f63f76e652e" volumeName="kubernetes.io/empty-dir/183a5212-1b21-44e4-9ed5-2f63f76e652e-cache" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170061 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="183a5212-1b21-44e4-9ed5-2f63f76e652e" volumeName="kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-ca-certs" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170080 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f65054f-caf3-4cd3-889e-8d5a5376b1b8" volumeName="kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-catalog-content" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170093 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="27b149f7-6aff-45f3-b935-e65279f2f9ee" volumeName="kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170166 
16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8512a7f6-889f-483e-960f-1ce3c834e92c" volumeName="kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170206 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170221 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bbc6fdd7-cbf1-416d-a986-bbd6ba259c05" volumeName="kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-tuned" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170887 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021" volumeName="kubernetes.io/projected/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-kube-api-access-9mzlv" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170907 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2369ce94-237f-41ad-9875-173578764483" volumeName="kubernetes.io/configmap/2369ce94-237f-41ad-9875-173578764483-signing-cabundle" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170947 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3faedef9-d507-48aa-82a8-f3dc9b5adeef" volumeName="kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170963 16352 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f69a884-5fe8-4c03-8258-ff35396efc30" volumeName="kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170979 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e94f64e-4a89-4d9d-acbd-80f86bf2f964" volumeName="kubernetes.io/projected/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-kube-api-access-vmp5q" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.170994 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="2369ce94-237f-41ad-9875-173578764483" volumeName="kubernetes.io/secret/2369ce94-237f-41ad-9875-173578764483-signing-key" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171008 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3caff2c1-f178-4e16-916d-27ccf178ff37" volumeName="kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171024 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171038 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171053 16352 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b12701eb-4226-4f9c-9398-ad0c3fea7451" volumeName="kubernetes.io/projected/b12701eb-4226-4f9c-9398-ad0c3fea7451-kube-api-access-f8mm9" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171069 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="420c6d8f-6313-4d6c-b817-420797fc6878" volumeName="kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171083 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4e94f64e-4a89-4d9d-acbd-80f86bf2f964" volumeName="kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171098 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7f69a884-5fe8-4c03-8258-ff35396efc30" volumeName="kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171113 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" volumeName="kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171128 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="dd310b71-6c79-4169-8b8a-7b3fe35a97fd" volumeName="kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171144 16352 
reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d" volumeName="kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171160 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" volumeName="kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171177 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" volumeName="kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171192 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="abfb5602-7255-43d7-a510-e7f94885887e" volumeName="kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171207 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46d1b044-16fb-4442-a554-6b15a8a1c8ae" volumeName="kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171221 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171236 
16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5f82d4aa-0cb5-477f-944e-745a21d124fc" volumeName="kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171250 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6d5765e6-80cc-404b-b375-c109febd1843" volumeName="kubernetes.io/projected/6d5765e6-80cc-404b-b375-c109febd1843-kube-api-access-8wps6" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171266 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="15270349-f3aa-43bc-88a8-f0fff3aa2528" volumeName="kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171282 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="599c055c-3517-46cb-b584-0050b12a7dea" volumeName="kubernetes.io/projected/599c055c-3517-46cb-b584-0050b12a7dea-kube-api-access-6bqlq" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171299 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d50f92ea-1c78-4535-a14c-96b00f2cf377" volumeName="kubernetes.io/configmap/d50f92ea-1c78-4535-a14c-96b00f2cf377-service-ca-bundle" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171313 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 
21:17:57.171330 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" volumeName="kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171344 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="290f6cf4-daa1-4cae-8e91-2411bf81f8b4" volumeName="kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-ca-certs" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171358 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="46d1b044-16fb-4442-a554-6b15a8a1c8ae" volumeName="kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171373 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6deed9a9-6702-4177-a35d-58ad9930a893" volumeName="kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171402 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="24f69689-ff12-4786-af05-61429e9eadf8" volumeName="kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171423 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="290f6cf4-daa1-4cae-8e91-2411bf81f8b4" volumeName="kubernetes.io/empty-dir/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-cache" seLinuxMountContext="" Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 
21:17:57.171441 16352 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b12701eb-4226-4f9c-9398-ad0c3fea7451" volumeName="kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config" seLinuxMountContext=""
Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171457 16352 reconstruct.go:97] "Volume reconstruction finished"
Mar 07 21:17:57.171506 master-0 kubenswrapper[16352]: I0307 21:17:57.171469 16352 reconciler.go:26] "Reconciler: start to sync state"
Mar 07 21:17:57.185713 master-0 kubenswrapper[16352]: I0307 21:17:57.185507 16352 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4"
Mar 07 21:17:57.187858 master-0 kubenswrapper[16352]: I0307 21:17:57.187802 16352 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6"
Mar 07 21:17:57.187858 master-0 kubenswrapper[16352]: I0307 21:17:57.187854 16352 status_manager.go:217] "Starting to sync pod status with apiserver"
Mar 07 21:17:57.188033 master-0 kubenswrapper[16352]: I0307 21:17:57.188007 16352 kubelet.go:2335] "Starting kubelet main sync loop"
Mar 07 21:17:57.188267 master-0 kubenswrapper[16352]: E0307 21:17:57.188229 16352 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]"
Mar 07 21:17:57.190239 master-0 kubenswrapper[16352]: I0307 21:17:57.190179 16352 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160
Mar 07 21:17:57.190504 master-0 kubenswrapper[16352]: I0307 21:17:57.190458 16352 reconstruct.go:205] "DevicePaths of reconstructed volumes updated"
Mar 07 21:17:57.200221 master-0 kubenswrapper[16352]: I0307 21:17:57.200160 16352 generic.go:334] "Generic (PLEG): container finished" podID="abfb5602-7255-43d7-a510-e7f94885887e" containerID="98e7e40d5b40416680e1b256712d9b6487df5695b6f01c16e2334511df19f429" exitCode=0
Mar 07 21:17:57.208509 master-0 kubenswrapper[16352]: I0307 21:17:57.208451 16352 generic.go:334] "Generic (PLEG): container finished" podID="e543d99f-e0dc-49be-95bd-c39eabd05ce8" containerID="bb9512b327c952122a6ba9c90bf697a16d6d7a153e8ba4baf488a717c15e85eb" exitCode=0
Mar 07 21:17:57.211698 master-0 kubenswrapper[16352]: I0307 21:17:57.211628 16352 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e" exitCode=2
Mar 07 21:17:57.222246 master-0 kubenswrapper[16352]: I0307 21:17:57.222189 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-config-operator_openshift-config-operator-64488f9d78-cb227_29624e4f-d970-4dfa-a8f1-515b73397c8f/openshift-config-operator/1.log"
Mar 07 21:17:57.222805 master-0 kubenswrapper[16352]: I0307 21:17:57.222677 16352 generic.go:334] "Generic (PLEG): container finished" podID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerID="422fc7c90ae0810330b8638468887bc09b7443376ae150d6d32db5bf56fae2bc" exitCode=255
Mar 07 21:17:57.222968 master-0 kubenswrapper[16352]: I0307 21:17:57.222807 16352 generic.go:334] "Generic (PLEG): container finished" podID="29624e4f-d970-4dfa-a8f1-515b73397c8f" containerID="7b14e0d42b70cc70f5e51131b552d35ae08e2304284eb28296a108002b51512b" exitCode=0
Mar 07 21:17:57.247997 master-0 kubenswrapper[16352]: I0307 21:17:57.247943 16352 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31" exitCode=0
Mar 07 21:17:57.247997 master-0 kubenswrapper[16352]: I0307 21:17:57.247988 16352 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2" exitCode=0
Mar 07 21:17:57.247997 master-0 kubenswrapper[16352]: I0307 21:17:57.247999 16352 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317" exitCode=0
Mar 07 21:17:57.250080 master-0 kubenswrapper[16352]: I0307 21:17:57.250019 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-2-master-0_bc5c4a14-0fdc-4c09-abda-7a2277a20c54/installer/0.log"
Mar 07 21:17:57.250164 master-0 kubenswrapper[16352]: I0307 21:17:57.250096 16352 generic.go:334] "Generic (PLEG): container finished" podID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerID="f731d58484b6e995b134d609352f74f3a18338de0be2a0cddb04f00bff760ac6" exitCode=1
Mar 07 21:17:57.256162 master-0 kubenswrapper[16352]: I0307 21:17:57.255999 16352 generic.go:334] "Generic (PLEG): container finished" podID="e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9" containerID="00d0a15073fd7fa3444cc6741cebfa512af7efaa071a744a0077952511813908" exitCode=0
Mar 07 21:17:57.273283 master-0 kubenswrapper[16352]: I0307 21:17:57.273202 16352 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="8fa7422d23bcb03f45ab2c3bec3ea5e6214caa8f28b047daca9c932d4eca1830" exitCode=0
Mar 07 21:17:57.273283 master-0 kubenswrapper[16352]: I0307 21:17:57.273269 16352 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="9574294e43be3e31b38cb24910c3f9a3961ac6d7fef3d8e88cef73fac06c22e3" exitCode=0
Mar 07 21:17:57.273283 master-0 kubenswrapper[16352]: I0307 21:17:57.273279 16352 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="22f599619e79420fd9506a4f183f60ba821b3ac500c2322da39e388d594122e4" exitCode=0
Mar 07 21:17:57.273283 master-0 kubenswrapper[16352]: I0307 21:17:57.273287 16352 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="a27fb1b48c1a71a257bd1f0e26afd03b783b613cf862675ed38b35ffc09792a8" exitCode=0
Mar 07 21:17:57.273283 master-0 kubenswrapper[16352]: I0307 21:17:57.273294 16352 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="1204b0d0bd4ef37ca4508ca7c0bfef9f1e850dc26e2ddde2b7523df8be7455e3" exitCode=0
Mar 07 21:17:57.273283 master-0 kubenswrapper[16352]: I0307 21:17:57.273302 16352 generic.go:334] "Generic (PLEG): container finished" podID="3caff2c1-f178-4e16-916d-27ccf178ff37" containerID="7321d4d6e798cb535bde4f9b51f6814dd5e6706005dac86d4315f2c88fc7fa27" exitCode=0
Mar 07 21:17:57.277855 master-0 kubenswrapper[16352]: I0307 21:17:57.277815 16352 generic.go:334] "Generic (PLEG): container finished" podID="c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9" containerID="87c8917d09451c4b2cd526c22f55e1e1633a895f47f1081055707fe4874946ed" exitCode=0
Mar 07 21:17:57.277964 master-0 kubenswrapper[16352]: I0307 21:17:57.277944 16352 generic.go:334] "Generic (PLEG): container finished" podID="c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9" containerID="58159da9ebda8be7e4de56fce5a62c915f96081b06accf5dd15dc2fbdbd7247f" exitCode=0
Mar 07 21:17:57.280583 master-0 kubenswrapper[16352]: I0307 21:17:57.280540 16352 generic.go:334] "Generic (PLEG): container finished" podID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerID="5d8a696d04df358a26bc157288f94a3ff4652e100c1ed368a8504d7b4df97ebb" exitCode=0
Mar 07 21:17:57.286270 master-0 kubenswrapper[16352]: I0307 21:17:57.286248 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-operator_ingress-operator-677db989d6-tklw9_47ecf172-666e-4360-97ff-bd9dbccc1fd6/ingress-operator/0.log"
Mar 07 21:17:57.286396 master-0 kubenswrapper[16352]: I0307 21:17:57.286373 16352 generic.go:334] "Generic (PLEG): container finished" podID="47ecf172-666e-4360-97ff-bd9dbccc1fd6" containerID="d9c9700ef3cdaba6833e00d44e39806385f696f37ff17a4df92695c36e563c13" exitCode=1
Mar 07 21:17:57.288320 master-0 kubenswrapper[16352]: E0307 21:17:57.288282 16352 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 07 21:17:57.293149 master-0 kubenswrapper[16352]: I0307 21:17:57.293089 16352 generic.go:334] "Generic (PLEG): container finished" podID="8269652e-360f-43ef-9e7d-473c5f478275" containerID="e4c20cfb39db1342bdb31f41fc9c1caf9efa43065ea9e9334f061db96ddead54" exitCode=0
Mar 07 21:17:57.293149 master-0 kubenswrapper[16352]: I0307 21:17:57.293143 16352 generic.go:334] "Generic (PLEG): container finished" podID="8269652e-360f-43ef-9e7d-473c5f478275" containerID="a4635f7548cc73236087a85660453eabf881ce7b06599d4a7dd2447ded616584" exitCode=0
Mar 07 21:17:57.293276 master-0 kubenswrapper[16352]: I0307 21:17:57.293159 16352 generic.go:334] "Generic (PLEG): container finished" podID="8269652e-360f-43ef-9e7d-473c5f478275" containerID="2469613253a6fb83e25c2520824b0decb6e3207bc68cca5286a33e44b6873206" exitCode=0
Mar 07 21:17:57.299235 master-0 kubenswrapper[16352]: I0307 21:17:57.299176 16352 generic.go:334] "Generic (PLEG): container finished" podID="420c6d8f-6313-4d6c-b817-420797fc6878" containerID="89e83b02510db448aa7211c7a69aa7fdf926031ee29094a8ecb9aeeb18ccc925" exitCode=0
Mar 07 21:17:57.301276 master-0 kubenswrapper[16352]: I0307 21:17:57.301254 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kpsm4_27b149f7-6aff-45f3-b935-e65279f2f9ee/approver/0.log"
Mar 07 21:17:57.301856 master-0 kubenswrapper[16352]: I0307 21:17:57.301828 16352 generic.go:334] "Generic (PLEG): container finished" podID="27b149f7-6aff-45f3-b935-e65279f2f9ee" containerID="98d5387debce255a652d1b794239fb6ace25d54dad34766bdbf701b015ffe247" exitCode=1
Mar 07 21:17:57.305871 master-0 kubenswrapper[16352]: I0307 21:17:57.305823 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_olm-operator-d64cfc9db-qd6xh_69851821-e1fc-44a8-98df-0cfe9d564126/olm-operator/0.log"
Mar 07 21:17:57.305982 master-0 kubenswrapper[16352]: I0307 21:17:57.305871 16352 generic.go:334] "Generic (PLEG): container finished" podID="69851821-e1fc-44a8-98df-0cfe9d564126" containerID="7aaed8a833b3068593d26b6804ec3a006285f7a402c4ef65546ea1c84ea6ae4d" exitCode=1
Mar 07 21:17:57.309311 master-0 kubenswrapper[16352]: I0307 21:17:57.309263 16352 generic.go:334] "Generic (PLEG): container finished" podID="f08edf29-c53f-452d-880b-e8ce27b05b6f" containerID="f6257eda77e6ac921b623c45ec6e6f8e7833cf0c08e715e3b224823a05866040" exitCode=0
Mar 07 21:17:57.309311 master-0 kubenswrapper[16352]: I0307 21:17:57.309299 16352 generic.go:334] "Generic (PLEG): container finished" podID="f08edf29-c53f-452d-880b-e8ce27b05b6f" containerID="cbc42b42c68bace1ed2fefd81d3e1aeee69e8aeb452acfb8ef11e0a4a41a9443" exitCode=0
Mar 07 21:17:57.318899 master-0 kubenswrapper[16352]: I0307 21:17:57.318842 16352 generic.go:334] "Generic (PLEG): container finished" podID="5f82d4aa-0cb5-477f-944e-745a21d124fc" containerID="42f741a1d8745f4ba4855310764e131077825a56cb2981843ca7f7c641b06c4d" exitCode=0
Mar 07 21:17:57.331727 master-0 kubenswrapper[16352]: I0307 21:17:57.330902 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_catalog-operator-7d9c49f57b-j454x_7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149/catalog-operator/0.log"
Mar 07 21:17:57.331727 master-0 kubenswrapper[16352]: I0307 21:17:57.330955 16352 generic.go:334] "Generic (PLEG): container finished" podID="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" containerID="6690322ef152ddb1743025780f4e212cb381fc5357beb0407cc2777292df2c5a" exitCode=1
Mar 07 21:17:57.345211 master-0 kubenswrapper[16352]: I0307 21:17:57.336359 16352 generic.go:334] "Generic (PLEG): container finished" podID="3faedef9-d507-48aa-82a8-f3dc9b5adeef" containerID="f737e30d954aa064b6cfef3a212e4d7f5057ece37e1afcdb2a92dd75d8adab26" exitCode=0
Mar 07 21:17:57.345211 master-0 kubenswrapper[16352]: I0307 21:17:57.340070 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-config-operator_kube-rbac-proxy-crio-master-0_e9add8df47182fc2eaf8cd78016ebe72/kube-rbac-proxy-crio/2.log"
Mar 07 21:17:57.345211 master-0 kubenswrapper[16352]: I0307 21:17:57.340588 16352 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932" exitCode=1
Mar 07 21:17:57.345211 master-0 kubenswrapper[16352]: I0307 21:17:57.340607 16352 generic.go:334] "Generic (PLEG): container finished" podID="e9add8df47182fc2eaf8cd78016ebe72" containerID="f775564de6004b1533b00fbc2fd4348436f4183f4b5381b615f45abdd8af0248" exitCode=0
Mar 07 21:17:57.345211 master-0 kubenswrapper[16352]: I0307 21:17:57.343413 16352 generic.go:334] "Generic (PLEG): container finished" podID="7d462ed3-d191-42a5-b8e0-79ab9af13991" containerID="0d151e14131d9760e6564e5100dd52c146e8d9e99a88e5e1621708256085d68d" exitCode=0
Mar 07 21:17:57.351840 master-0 kubenswrapper[16352]: I0307 21:17:57.351799 16352 generic.go:334] "Generic (PLEG): container finished" podID="b88c5fbe-e19f-45b3-ab03-e1626f95776d" containerID="4dd4ab96de66a81d1a97cd72bb912ec500681a0000024a0cfaf545c2eaf36106" exitCode=0
Mar 07 21:17:57.354092 master-0 kubenswrapper[16352]: I0307 21:17:57.354030 16352 generic.go:334] "Generic (PLEG): container finished" podID="7f65054f-caf3-4cd3-889e-8d5a5376b1b8" containerID="001b65d1db011b8bb96e16a7cb4e30e0932ee8b9e303affc470347e6b0c4af77" exitCode=0
Mar 07 21:17:57.354092 master-0 kubenswrapper[16352]: I0307 21:17:57.354084 16352 generic.go:334] "Generic (PLEG): container finished" podID="7f65054f-caf3-4cd3-889e-8d5a5376b1b8" containerID="35e0e5cfb37740a966e4bb6ed64ac7190b87360fa40dbcf877d2b0069b3065cd" exitCode=0
Mar 07 21:17:57.359101 master-0 kubenswrapper[16352]: I0307 21:17:57.359064 16352 generic.go:334] "Generic (PLEG): container finished" podID="2d827a93-49e5-4694-b119-957cfa9bd648" containerID="485cabca7a9edbb9a83d8ef9ee43891f8c296cb8958998f7a4fa97d4fc8e25c3" exitCode=0
Mar 07 21:17:57.365860 master-0 kubenswrapper[16352]: I0307 21:17:57.365818 16352 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612" exitCode=0
Mar 07 21:17:57.375440 master-0 kubenswrapper[16352]: I0307 21:17:57.375362 16352 generic.go:334] "Generic (PLEG): container finished" podID="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" containerID="2aa15de1b2bb19d02b5747770e2bfa186549430d792f08f2ab12bc57e19314a5" exitCode=0
Mar 07 21:17:57.375440 master-0 kubenswrapper[16352]: I0307 21:17:57.375400 16352 generic.go:334] "Generic (PLEG): container finished" podID="5625eb9f-c80b-47b1-b70c-aa636fbc03ac" containerID="0b4bb2c8e80fc01b0e3b4c15c93598e07e450d614fd19ce1345979feccb7c709" exitCode=0
Mar 07 21:17:57.376925 master-0 kubenswrapper[16352]: I0307 21:17:57.376879 16352 generic.go:334] "Generic (PLEG): container finished" podID="2357c135-5d09-4657-9038-48d25ed55b2d" containerID="c99ad91f1912453e3999a78e354c969699bc344538ab4adcf769bc12a98842c2" exitCode=0
Mar 07 21:17:57.384082 master-0 kubenswrapper[16352]: I0307 21:17:57.384002 16352 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da" exitCode=1
Mar 07 21:17:57.388852 master-0 kubenswrapper[16352]: I0307 21:17:57.388766 16352 generic.go:334] "Generic (PLEG): container finished" podID="5b339e6a-cae6-416a-963b-2fd23cecba96" containerID="4ce1bc8e249944d7cde9f138282cf087a8521cf190e44f0f1b32f20172ea8a91" exitCode=0
Mar 07 21:17:57.394016 master-0 kubenswrapper[16352]: I0307 21:17:57.393953 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-1-master-0_34e734b7-82d6-493d-ace8-1945b2c08c6d/installer/0.log"
Mar 07 21:17:57.394141 master-0 kubenswrapper[16352]: I0307 21:17:57.394028 16352 generic.go:334] "Generic (PLEG): container finished" podID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerID="b18addaef135e00fefdd51e68b734344679afa8f4606f39797d35e107db0fa22" exitCode=1
Mar 07 21:17:57.398984 master-0 kubenswrapper[16352]: I0307 21:17:57.398913 16352 generic.go:334] "Generic (PLEG): container finished" podID="24f69689-ff12-4786-af05-61429e9eadf8" containerID="c541936d2c1e33ad24f13bb7de438be39b6542e54689f0c9212561c0b1fef232" exitCode=0
Mar 07 21:17:57.403598 master-0 kubenswrapper[16352]: I0307 21:17:57.403526 16352 generic.go:334] "Generic (PLEG): container finished" podID="e757a93e-91aa-4fce-949b-4c51a060528e" containerID="a049a3a4077135fa9e02b1a9804eac864bd6874b0847dc250b8650ce1e94ce1d" exitCode=0
Mar 07 21:17:57.405396 master-0 kubenswrapper[16352]: I0307 21:17:57.405345 16352 generic.go:334] "Generic (PLEG): container finished" podID="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" containerID="ee323378e5f254b4936ebddaed79c44e072c4abc42a4ea5e2f28f2991df5cf33" exitCode=0
Mar 07 21:17:57.427231 master-0 kubenswrapper[16352]: I0307 21:17:57.427175 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-v4xm9_f8980370-267c-4168-ba97-d780698533ff/network-operator/0.log"
Mar 07 21:17:57.427343 master-0 kubenswrapper[16352]: I0307 21:17:57.427229 16352 generic.go:334] "Generic (PLEG): container finished" podID="f8980370-267c-4168-ba97-d780698533ff" containerID="a365b415335d369b3b6313971188bcd1400d9e9f3efd23b32ee5ec456091c9db" exitCode=255
Mar 07 21:17:57.429292 master-0 kubenswrapper[16352]: I0307 21:17:57.429261 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-4-master-0_ddc814a4-b865-4a35-b5f8-f54af449fe25/installer/0.log"
Mar 07 21:17:57.429292 master-0 kubenswrapper[16352]: I0307 21:17:57.429295 16352 generic.go:334] "Generic (PLEG): container finished" podID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerID="f84fa34d05ad67aec62ca362c7866be59185619297d89ccd25b8d12c9a739a50" exitCode=1
Mar 07 21:17:57.431944 master-0 kubenswrapper[16352]: I0307 21:17:57.431782 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-controller-manager-operator_openshift-controller-manager-operator-8565d84698-98wdp_bd633b72-3d0b-4601-a2c2-3f487d943b35/openshift-controller-manager-operator/0.log"
Mar 07 21:17:57.431944 master-0 kubenswrapper[16352]: I0307 21:17:57.431815 16352 generic.go:334] "Generic (PLEG): container finished" podID="bd633b72-3d0b-4601-a2c2-3f487d943b35" containerID="8db5d27113ab5fae894c6cc0107da033c6196250dc7c341eeb4aaf2ff2d3a924" exitCode=1
Mar 07 21:17:57.433303 master-0 kubenswrapper[16352]: I0307 21:17:57.433231 16352 generic.go:334] "Generic (PLEG): container finished" podID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerID="f11dea03780316a0cd94d2e932a489c49a45b9ec1636336c36582f2f1729ff4b" exitCode=0
Mar 07 21:17:57.488485 master-0 kubenswrapper[16352]: E0307 21:17:57.488383 16352 kubelet.go:2359] "Skipping pod synchronization" err="container runtime status check may not have completed yet"
Mar 07 21:17:57.622793 master-0 kubenswrapper[16352]: I0307 21:17:57.619339 16352 manager.go:324] Recovery completed
Mar 07 21:17:57.722104 master-0 kubenswrapper[16352]: I0307 21:17:57.722021 16352 cpu_manager.go:225] "Starting CPU manager" policy="none"
Mar 07 21:17:57.722104 master-0 kubenswrapper[16352]: I0307 21:17:57.722076 16352 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s"
Mar 07 21:17:57.722472 master-0 kubenswrapper[16352]: I0307 21:17:57.722136 16352 state_mem.go:36] "Initialized new in-memory state store"
Mar 07 21:17:57.722472 master-0 kubenswrapper[16352]: I0307 21:17:57.722449 16352 state_mem.go:88] "Updated default CPUSet" cpuSet=""
Mar 07 21:17:57.722590 master-0 kubenswrapper[16352]: I0307 21:17:57.722473 16352 state_mem.go:96] "Updated CPUSet assignments" assignments={}
Mar 07 21:17:57.722590 master-0 kubenswrapper[16352]: I0307 21:17:57.722511 16352 state_checkpoint.go:136] "State checkpoint: restored state from checkpoint"
Mar 07 21:17:57.722590 master-0 kubenswrapper[16352]: I0307 21:17:57.722524 16352 state_checkpoint.go:137] "State checkpoint: defaultCPUSet" defaultCpuSet=""
Mar 07 21:17:57.722590 master-0 kubenswrapper[16352]: I0307 21:17:57.722573 16352 policy_none.go:49] "None policy: Start"
Mar 07 21:17:57.727019 master-0 kubenswrapper[16352]: I0307 21:17:57.726947 16352 memory_manager.go:170] "Starting memorymanager" policy="None"
Mar 07 21:17:57.727252 master-0 kubenswrapper[16352]: I0307 21:17:57.727040 16352 state_mem.go:35] "Initializing new in-memory state store"
Mar 07 21:17:57.727479 master-0 kubenswrapper[16352]: I0307 21:17:57.727436 16352 state_mem.go:75] "Updated machine memory state"
Mar 07 21:17:57.727479 master-0 kubenswrapper[16352]: I0307 21:17:57.727467 16352 state_checkpoint.go:82] "State checkpoint: restored state from checkpoint"
Mar 07 21:17:57.749225 master-0 kubenswrapper[16352]: I0307 21:17:57.749188 16352 manager.go:334] "Starting Device Plugin manager"
Mar 07 21:17:57.749464 master-0 kubenswrapper[16352]: I0307 21:17:57.749446 16352 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found"
Mar 07 21:17:57.749545 master-0 kubenswrapper[16352]: I0307 21:17:57.749533 16352 server.go:79] "Starting device plugin registration server"
Mar 07 21:17:57.750375 master-0 kubenswrapper[16352]: I0307 21:17:57.750357 16352 eviction_manager.go:189] "Eviction manager: starting control loop"
Mar 07 21:17:57.750549 master-0 kubenswrapper[16352]: I0307 21:17:57.750465 16352 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Mar 07 21:17:57.750796 master-0 kubenswrapper[16352]: I0307 21:17:57.750739 16352 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Mar 07 21:17:57.751055 master-0 kubenswrapper[16352]: I0307 21:17:57.750995 16352 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Mar 07 21:17:57.751055 master-0 kubenswrapper[16352]: I0307 21:17:57.751030 16352 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Mar 07 21:17:57.850913 master-0 kubenswrapper[16352]: I0307 21:17:57.850790 16352 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 07 21:17:57.854994 master-0 kubenswrapper[16352]: I0307 21:17:57.854600 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory"
Mar 07 21:17:57.854994 master-0 kubenswrapper[16352]: I0307 21:17:57.854666 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure"
Mar 07 21:17:57.854994 master-0 kubenswrapper[16352]: I0307 21:17:57.854721 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID"
Mar 07 21:17:57.854994 master-0 kubenswrapper[16352]: I0307 21:17:57.854857 16352 kubelet_node_status.go:76] "Attempting to register node" node="master-0"
Mar 07 21:17:57.859886 master-0 kubenswrapper[16352]: E0307 21:17:57.859762 16352 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0"
Mar 07 21:17:57.889747 master-0 kubenswrapper[16352]: I0307 21:17:57.889511 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0","kube-system/bootstrap-kube-scheduler-master-0","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0","openshift-etcd/etcd-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 07 21:17:57.891327 master-0 kubenswrapper[16352]: I0307 21:17:57.891217 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"554ffc5919fe7a46fc0ad2b26594bc2dec62e5f792ce74d74fe8d549af25bf01"}
Mar 07 21:17:57.891327 master-0 kubenswrapper[16352]: I0307 21:17:57.891317 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerDied","Data":"32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891341 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"280e10e4ead7199cb4e5eb06d68976c14126e54c3ec3e9d229c33b8faed6eeb7"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891356 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" event={"ID":"f78c05e1499b533b83f091333d61f045","Type":"ContainerStarted","Data":"a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891477 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891501 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891517 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891533 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891549 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891565 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31"}
Mar 07 21:17:57.891564 master-0 kubenswrapper[16352]: I0307 21:17:57.891586 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.891607 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerDied","Data":"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.891626 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"8e52bef89f4b50e4590a1719bcc5d7e5","Type":"ContainerStarted","Data":"c5fe741f56a7c124f0c5d897131eff835b06c242a678fc6fc3ec3b91d7d391b6"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.891648 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae393ba35dc3e8c28db54e63f526ea5a6d375dafe0c9fef22965081dde677e6d"
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.891759 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cfdf5d1c192e96137200ef5781636512d3b3011cc2213de1177b003ae8bafb4a"
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.891941 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e34e1ead14a1f46ae17df72bae6b5228c67e70a551aab1dde4319fa7bdff201f"
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.891958 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"a056ccba22060bdb53ac003460ab1c7bac5832040445f86cf7efe33efd5a3ab2"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.891981 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"625aa1c428d191eb81000bc2269d91379df01d249490956a097136555eab8932"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892000 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerDied","Data":"f775564de6004b1533b00fbc2fd4348436f4183f4b5381b615f45abdd8af0248"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892016 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" event={"ID":"e9add8df47182fc2eaf8cd78016ebe72","Type":"ContainerStarted","Data":"8a1cbe644565b43a4da169795457d15c47700b9e972a545ed433cb9c25264e61"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892079 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5979bdc29bf43a37e21ede7928de7672454b5be0dc2586ea081f6897456b047c"
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892091 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5a8709a9cdf03339a9e48b29f2f9c191aba8f725156bf24ecadc366413648838"
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892106 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892118 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892130 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892141 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892153 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892166 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerDied","Data":"5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892178 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"cdcecc61ff5eeb08bd2a3ac12599e4f9","Type":"ContainerStarted","Data":"f15e8f8db26fbd6b95afa2f46e4951a7cba8bb576b3fcbe8a5a4c88bb100dafc"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892214 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef98d2107480b4bd6f967de3d6f619d44a784a65573272c4ea9717c84d83ed26"
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892224 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892235 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerDied","Data":"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892251 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="kube-system/bootstrap-kube-scheduler-master-0" event={"ID":"a1a56802af72ce1aac6b5077f1695ac0","Type":"ContainerStarted","Data":"b4fca5b617da316e897c888591517ee6b6d02e9f77cffb24422e96622b9ff582"}
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892273 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ccf912ee4b17090925c8bd3ce2fd46e80c3fa5968c7c32c4d89480c998e7afa2"
Mar 07 21:17:57.892233 master-0 kubenswrapper[16352]: I0307 21:17:57.892288 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6de23860b0b81dd71d1a71f02b3b23b5ac8368494a9752dfe36eb798dc3827b1"
Mar 07 21:17:57.893727 master-0 kubenswrapper[16352]: I0307 21:17:57.892307 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"d24d032319a9f87acbbf34deb36cb14122c07e93e1e3dd0d42d28beaf572ecc6"}
Mar 07 21:17:57.893727 master-0 kubenswrapper[16352]: I0307 21:17:57.892321 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"f417e14665db2ffffa887ce21c9ff0ed","Type":"ContainerStarted","Data":"f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174"}
Mar 07 21:17:57.893727 master-0 kubenswrapper[16352]: I0307 21:17:57.892341 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34c367e3b7cd662a238cd3cf60724c5f41e1100b6bc750255dda8f40be5bf92e"
Mar 07 21:17:57.893727 master-0 kubenswrapper[16352]: I0307 21:17:57.892400 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="463d8bfc31fe475b18975fa1110d938e01959c570bcc75066d9a8d30bafab290"
Mar 07 21:17:57.893727 master-0 kubenswrapper[16352]: I0307 21:17:57.892420 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="67e0f8c1f7a45675a4623829608878eb12540fff9458a8e44361ada4a21cc9a2"
Mar 07 21:17:57.992371 master-0 kubenswrapper[16352]: I0307 21:17:57.992260 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0"
Mar 07 21:17:57.992371 master-0 kubenswrapper[16352]: I0307 21:17:57.992344 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:17:57.992371 master-0 kubenswrapper[16352]: I0307 21:17:57.992398 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992431 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992457 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992482 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0"
Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992501 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992520 16352 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992544 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992564 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992581 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992598 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 
21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992674 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992712 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992731 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992748 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992768 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod 
\"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992788 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992804 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992865 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992883 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992909 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:57.993034 master-0 kubenswrapper[16352]: I0307 21:17:57.992927 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.060924 master-0 kubenswrapper[16352]: I0307 21:17:58.060846 16352 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:17:58.065059 master-0 kubenswrapper[16352]: I0307 21:17:58.064983 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:17:58.065212 master-0 kubenswrapper[16352]: I0307 21:17:58.065067 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:17:58.065212 master-0 kubenswrapper[16352]: I0307 21:17:58.065082 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:17:58.065381 master-0 kubenswrapper[16352]: I0307 21:17:58.065257 16352 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:17:58.069075 master-0 kubenswrapper[16352]: E0307 21:17:58.069007 16352 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"master-0\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="master-0" Mar 07 21:17:58.093395 master-0 kubenswrapper[16352]: I0307 21:17:58.093288 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.093395 master-0 kubenswrapper[16352]: I0307 21:17:58.093403 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 21:17:58.093449 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 21:17:58.093455 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 21:17:58.093497 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 
21:17:58.093600 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 21:17:58.093639 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 21:17:58.093669 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 21:17:58.093651 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.093842 master-0 kubenswrapper[16352]: I0307 21:17:58.093732 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.093877 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.093886 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.093921 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.093955 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.093995 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094037 16352 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094011 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094078 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094114 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094124 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094172 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes-cloud\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094172 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094210 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"bootstrap-kube-scheduler-master-0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094283 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.093803 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 07 21:17:58.094316 master-0 kubenswrapper[16352]: I0307 21:17:58.094260 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094411 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094464 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094511 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094542 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/e9add8df47182fc2eaf8cd78016ebe72-etc-kube\") pod \"kube-rbac-proxy-crio-master-0\" (UID: \"e9add8df47182fc2eaf8cd78016ebe72\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094556 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094610 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094646 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094612 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094702 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094749 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094793 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094826 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094828 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094851 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094878 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") 
pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094887 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094937 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094949 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"bootstrap-kube-controller-manager-master-0\" (UID: \"f78c05e1499b533b83f091333d61f045\") " pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094994 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") " pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:58.095277 master-0 kubenswrapper[16352]: I0307 21:17:58.094999 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"etcd-master-0\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.121516 master-0 kubenswrapper[16352]: I0307 21:17:58.121404 16352 apiserver.go:52] "Watching apiserver" Mar 07 21:17:58.151081 master-0 kubenswrapper[16352]: I0307 21:17:58.150884 16352 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 07 21:17:58.154655 master-0 kubenswrapper[16352]: I0307 21:17:58.154537 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-mmqbs","openshift-multus/multus-g6nmq","openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx","openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4","openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v","openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml","openshift-machine-config-operator/machine-config-server-xskwx","openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f","openshift-ovn-kubernetes/ovnkube-node-x9v76","openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw","openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x","openshift-machine-config-operator/machine-config-daemon-kp74q","openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg","openshift-network-node-identity/network-node-identity-kpsm4","openshift-marketplace/redhat-operators-fdltd","openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx","kube-system/bootstrap-kube-controller-manager-master-0","openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh","openshift-ingress-operator/ingress-operator-677db989d6-tklw9","openshift-etcd/etcd-master-0","openshift-kube-apiserver/bootstrap-kube-apiserver-master-0","openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cd
fdfb-wb26b","openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz","openshift-multus/multus-additional-cni-plugins-xf7kg","openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6","openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74","openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h","openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h","openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd","openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb","openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr","openshift-network-operator/iptables-alerter-n8nz9","openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l","openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5","openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0","openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft","assisted-installer/assisted-installer-controller-mqwls","openshift-kube-apiserver/installer-1-master-0","openshift-marketplace/redhat-marketplace-z2cc9","openshift-network-diagnostics/network-check-target-fr4qr","openshift-etcd/installer-1-master-0","openshift-kube-apiserver/kube-apiserver-master-0","openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr","openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k","openshift-cluster-node-tuning-operator/tuned-qzjmv","openshift-controller-manager/controller-manager-86d86fcf49-hgbkg","openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m","openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn","openshift-dns/node-resolver-zhkfm","openshift-ingress/router-default-79f8cd6fdd-858hg","openshift-kube-apiserver/installer-1-retry-1-master-0","openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4","openshift-catalogd/catalogd-control
ler-manager-7f8b8b6f4c-mc2rc","openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j","openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7","openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk","openshift-kube-scheduler/installer-4-master-0","openshift-marketplace/community-operators-rw59s","openshift-kube-controller-manager/installer-2-master-0","openshift-service-ca/service-ca-84bfdbbb7f-h76wh","openshift-apiserver/apiserver-694d775589-btnh4","openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf","openshift-dns/dns-default-hm77f","openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp","openshift-config-operator/openshift-config-operator-64488f9d78-cb227","openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q","openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz","openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh","openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp","openshift-dns-operator/dns-operator-589895fbb7-wqqqr","openshift-insights/insights-operator-8f89dfddd-rlx9x","openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8","openshift-marketplace/certified-operators-vxpb5","openshift-multus/network-metrics-daemon-l2bdp","openshift-network-operator/network-operator-7c649bf6d4-v4xm9","openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz","kube-system/bootstrap-kube-scheduler-master-0","openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9","openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"] Mar 07 21:17:58.155143 master-0 kubenswrapper[16352]: I0307 21:17:58.154892 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="assisted-installer/assisted-installer-controller-mqwls" Mar 07 21:17:58.162300 master-0 kubenswrapper[16352]: I0307 21:17:58.161352 16352 kubelet.go:2566] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="dccc88d9-f88f-4d19-bad9-7cccf7e5a543" Mar 07 21:17:58.164819 master-0 kubenswrapper[16352]: I0307 21:17:58.164233 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 07 21:17:58.166751 master-0 kubenswrapper[16352]: I0307 21:17:58.166669 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 21:17:58.167091 master-0 kubenswrapper[16352]: I0307 21:17:58.167062 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 21:17:58.191391 master-0 kubenswrapper[16352]: I0307 21:17:58.184748 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.191391 master-0 kubenswrapper[16352]: I0307 21:17:58.187256 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-1-master-0" Mar 07 21:17:58.191664 master-0 kubenswrapper[16352]: I0307 21:17:58.191539 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls" Mar 07 21:17:58.191944 master-0 kubenswrapper[16352]: I0307 21:17:58.191888 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 21:17:58.192337 master-0 kubenswrapper[16352]: I0307 21:17:58.192283 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt" Mar 07 21:17:58.192395 master-0 kubenswrapper[16352]: I0307 21:17:58.192350 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy" Mar 07 21:17:58.193124 master-0 kubenswrapper[16352]: I0307 21:17:58.193067 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 21:17:58.193409 master-0 kubenswrapper[16352]: I0307 21:17:58.193382 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.193739 master-0 kubenswrapper[16352]: I0307 21:17:58.193426 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 07 21:17:58.195204 master-0 kubenswrapper[16352]: I0307 21:17:58.195156 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 07 21:17:58.195503 master-0 kubenswrapper[16352]: I0307 21:17:58.195456 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.200006 master-0 kubenswrapper[16352]: I0307 21:17:58.199944 16352 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-master-0" Mar 07 21:17:58.200006 master-0 kubenswrapper[16352]: I0307 21:17:58.199996 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.200407 master-0 kubenswrapper[16352]: I0307 21:17:58.200380 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 07 21:17:58.200521 master-0 kubenswrapper[16352]: I0307 21:17:58.200473 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.200626 master-0 kubenswrapper[16352]: I0307 21:17:58.200607 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 21:17:58.200739 master-0 kubenswrapper[16352]: I0307 21:17:58.200404 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.200907 master-0 kubenswrapper[16352]: I0307 21:17:58.200862 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 07 21:17:58.200960 master-0 kubenswrapper[16352]: I0307 21:17:58.200603 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 07 21:17:58.201128 master-0 kubenswrapper[16352]: I0307 21:17:58.201104 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 07 21:17:58.201188 master-0 kubenswrapper[16352]: I0307 21:17:58.200981 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config" Mar 07 21:17:58.201188 master-0 
kubenswrapper[16352]: I0307 21:17:58.201179 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 07 21:17:58.201520 master-0 kubenswrapper[16352]: I0307 21:17:58.201487 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.201574 master-0 kubenswrapper[16352]: I0307 21:17:58.201510 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 21:17:58.201706 master-0 kubenswrapper[16352]: I0307 21:17:58.201642 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 21:17:58.201766 master-0 kubenswrapper[16352]: I0307 21:17:58.201726 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.201766 master-0 kubenswrapper[16352]: I0307 21:17:58.201745 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-master-0" Mar 07 21:17:58.201948 master-0 kubenswrapper[16352]: I0307 21:17:58.201908 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 07 21:17:58.201948 master-0 kubenswrapper[16352]: I0307 21:17:58.201892 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert" Mar 07 21:17:58.202249 master-0 kubenswrapper[16352]: I0307 21:17:58.202211 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 07 21:17:58.202371 master-0 kubenswrapper[16352]: I0307 21:17:58.202261 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-2-master-0" Mar 07 21:17:58.202426 master-0 kubenswrapper[16352]: I0307 21:17:58.202402 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 21:17:58.202650 master-0 kubenswrapper[16352]: I0307 21:17:58.202620 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 21:17:58.202996 master-0 kubenswrapper[16352]: I0307 21:17:58.202961 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert" Mar 07 21:17:58.203328 master-0 kubenswrapper[16352]: I0307 21:17:58.203298 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 07 21:17:58.204272 master-0 kubenswrapper[16352]: I0307 21:17:58.204202 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 07 21:17:58.204440 master-0 kubenswrapper[16352]: I0307 21:17:58.204414 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt" Mar 07 21:17:58.209274 master-0 kubenswrapper[16352]: I0307 21:17:58.204608 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 21:17:58.209274 master-0 kubenswrapper[16352]: I0307 21:17:58.207286 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert" Mar 07 21:17:58.209274 master-0 kubenswrapper[16352]: I0307 21:17:58.208394 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 07 21:17:58.209274 master-0 kubenswrapper[16352]: I0307 21:17:58.208761 16352 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 21:17:58.209274 master-0 kubenswrapper[16352]: I0307 21:17:58.209119 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert" Mar 07 21:17:58.209815 master-0 kubenswrapper[16352]: I0307 21:17:58.209289 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.209815 master-0 kubenswrapper[16352]: I0307 21:17:58.209451 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 07 21:17:58.209815 master-0 kubenswrapper[16352]: I0307 21:17:58.209758 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 07 21:17:58.210075 master-0 kubenswrapper[16352]: I0307 21:17:58.210033 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt" Mar 07 21:17:58.210126 master-0 kubenswrapper[16352]: I0307 21:17:58.210101 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 21:17:58.210373 master-0 kubenswrapper[16352]: I0307 21:17:58.210333 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 07 21:17:58.210460 master-0 kubenswrapper[16352]: I0307 21:17:58.210432 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.210614 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.210636 16352 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.210850 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.210909 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.210915 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.210924 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211010 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211029 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211100 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211117 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211125 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211204 16352 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211234 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211286 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211331 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211099 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211418 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211468 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211524 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211544 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211559 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 
21:17:58.211621 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211527 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211032 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211717 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211767 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211041 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211902 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211925 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211370 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211964 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 07 21:17:58.212217 master-0 
kubenswrapper[16352]: I0307 21:17:58.210939 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211418 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.211422 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 07 21:17:58.212217 master-0 kubenswrapper[16352]: I0307 21:17:58.212093 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 07 21:17:58.216216 master-0 kubenswrapper[16352]: I0307 21:17:58.212114 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 07 21:17:58.216216 master-0 kubenswrapper[16352]: I0307 21:17:58.212491 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 21:17:58.216216 master-0 kubenswrapper[16352]: I0307 21:17:58.212801 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 07 21:17:58.216216 master-0 kubenswrapper[16352]: I0307 21:17:58.212822 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 21:17:58.216216 master-0 kubenswrapper[16352]: I0307 21:17:58.213194 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 07 21:17:58.216216 master-0 kubenswrapper[16352]: I0307 21:17:58.213779 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 21:17:58.216216 master-0 
kubenswrapper[16352]: I0307 21:17:58.213997 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 07 21:17:58.216216 master-0 kubenswrapper[16352]: I0307 21:17:58.215692 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 07 21:17:58.216875 master-0 kubenswrapper[16352]: I0307 21:17:58.216551 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt" Mar 07 21:17:58.216875 master-0 kubenswrapper[16352]: I0307 21:17:58.216777 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 07 21:17:58.217828 master-0 kubenswrapper[16352]: I0307 21:17:58.217378 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config" Mar 07 21:17:58.222778 master-0 kubenswrapper[16352]: I0307 21:17:58.221416 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 07 21:17:58.225482 master-0 kubenswrapper[16352]: I0307 21:17:58.225412 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.248796 master-0 kubenswrapper[16352]: I0307 21:17:58.237129 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 07 21:17:58.248796 master-0 kubenswrapper[16352]: I0307 21:17:58.240778 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 07 21:17:58.248796 master-0 kubenswrapper[16352]: I0307 21:17:58.244491 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 07 21:17:58.248796 master-0 kubenswrapper[16352]: I0307 21:17:58.244565 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 07 21:17:58.252749 master-0 kubenswrapper[16352]: I0307 21:17:58.252483 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 07 21:17:58.253208 master-0 kubenswrapper[16352]: I0307 21:17:58.253128 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 21:17:58.254851 master-0 kubenswrapper[16352]: I0307 21:17:58.254803 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 21:17:58.255047 master-0 kubenswrapper[16352]: I0307 21:17:58.254981 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 07 21:17:58.256280 master-0 kubenswrapper[16352]: I0307 21:17:58.256247 16352 desired_state_of_world_populator.go:155] "Finished populating initial desired state of world" Mar 07 21:17:58.257868 master-0 kubenswrapper[16352]: I0307 21:17:58.257806 16352 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 07 21:17:58.278319 master-0 kubenswrapper[16352]: I0307 21:17:58.278199 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 07 21:17:58.296729 master-0 kubenswrapper[16352]: I0307 21:17:58.296579 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operand-assets\" (UniqueName: \"kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:17:58.296729 master-0 kubenswrapper[16352]: I0307 21:17:58.296632 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-utilities\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:58.296729 master-0 kubenswrapper[16352]: I0307 21:17:58.296661 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.296729 master-0 kubenswrapper[16352]: I0307 21:17:58.296756 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 
21:17:58.296786 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.296891 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsspm\" (UniqueName: \"kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.296920 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.296946 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.296940 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operand-assets\" (UniqueName: 
\"kubernetes.io/empty-dir/8269652e-360f-43ef-9e7d-473c5f478275-operand-assets\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.296969 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.297043 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.297092 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wjtgs\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.297129 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: 
\"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.297164 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-encryption-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.297191 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:17:58.297279 master-0 kubenswrapper[16352]: I0307 21:17:58.297297 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297333 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297365 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-utilities\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297377 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b88c5fbe-e19f-45b3-ab03-e1626f95776d-serving-cert\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297395 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-catalog-content\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297431 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwj77\" (UniqueName: \"kubernetes.io/projected/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-kube-api-access-pwj77\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297471 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 
21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297502 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297537 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297567 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297606 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297637 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config\") pod 
\"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297901 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-binary-copy\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297946 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.298024 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd633b72-3d0b-4601-a2c2-3f487d943b35-config\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.298058 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " 
pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.297084 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-utilities\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:58.298423 master-0 kubenswrapper[16352]: I0307 21:17:58.298303 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298482 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-catalog-content\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298537 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e720291b-0f96-4ebb-80f2-5df7cb194ffc-package-server-manager-serving-cert\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298590 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-utilities\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: 
I0307 21:17:58.298734 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f8c93e0d-54e5-4c80-9d69-a70317baeacf-trusted-ca\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298749 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-config\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298827 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzr66\" (UniqueName: \"kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298873 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298907 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298946 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-image-import-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.298975 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bqlq\" (UniqueName: \"kubernetes.io/projected/599c055c-3517-46cb-b584-0050b12a7dea-kube-api-access-6bqlq\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299005 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299036 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299077 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbggb\" (UniqueName: \"kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299108 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-rootfs\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299147 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299203 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299245 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod 
\"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299278 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-systemd\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299314 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299406 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299443 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w94dz\" (UniqueName: \"kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299480 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2gv7\" (UniqueName: \"kubernetes.io/projected/5446df8b-23d4-4bf3-84ac-d8e1d18813af-kube-api-access-k2gv7\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299518 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299551 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2369ce94-237f-41ad-9875-173578764483-signing-key\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299587 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299622 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " 
pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299660 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65pgv\" (UniqueName: \"kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:17:58.299614 master-0 kubenswrapper[16352]: I0307 21:17:58.299707 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8512a7f6-889f-483e-960f-1ce3c834e92c-snapshots\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.299752 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-conf\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.299795 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-host\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.299836 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-serving-ca\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.299865 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.299908 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hqxlr\" (UniqueName: \"kubernetes.io/projected/f08edf29-c53f-452d-880b-e8ce27b05b6f-kube-api-access-hqxlr\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.299963 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300003 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gnnlw\" (UniqueName: \"kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 
21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300036 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zppz\" (UniqueName: \"kubernetes.io/projected/e38fc940-e59a-45ff-978b-fdcdc534a2a5-kube-api-access-2zppz\") pod \"migrator-57ccdf9b5-5l6h9\" (UID: \"e38fc940-e59a-45ff-978b-fdcdc534a2a5\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300074 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300102 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-catalog-content\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300124 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-tmp\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300157 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") 
" pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300211 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-client\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300231 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-dir\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300253 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kjhvg\" (UniqueName: \"kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300276 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300300 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config\") 
pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300328 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfa9d3-fc26-42e9-8bac-ff2c25223654-serving-cert\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300354 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tq99k\" (UniqueName: \"kubernetes.io/projected/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-kube-api-access-tq99k\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300540 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-ca-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.300836 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/8512a7f6-889f-483e-960f-1ce3c834e92c-snapshots\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301175 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-tmp\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301241 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e543d99f-e0dc-49be-95bd-c39eabd05ce8-serving-cert\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301273 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/47ecf172-666e-4360-97ff-bd9dbccc1fd6-trusted-ca\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301494 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5b339e6a-cae6-416a-963b-2fd23cecba96-serving-cert\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301821 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " 
pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301910 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-catalog-content\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301950 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301972 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.301991 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f72ps\" (UniqueName: \"kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.302053 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: 
\"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-daemon-config\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.302126 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.302162 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.302316 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2369ce94-237f-41ad-9875-173578764483-signing-key\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:17:58.302254 master-0 kubenswrapper[16352]: I0307 21:17:58.302323 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/f8980370-267c-4168-ba97-d780698533ff-metrics-tls\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 
21:17:58.302393 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-tmpfs\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302450 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3faedef9-d507-48aa-82a8-f3dc9b5adeef-config\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302447 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/fc392945-53ad-473c-8803-70e2026712d2-marketplace-trusted-ca\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302498 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lng9v\" (UniqueName: \"kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302526 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-tmpfs\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: 
\"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302547 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302590 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302627 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-trusted-ca-bundle\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302654 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302698 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302725 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nnk5\" (UniqueName: \"kubernetes.io/projected/7fa7b789-9201-493e-a96d-484a2622301a-kube-api-access-5nnk5\") pod \"csi-snapshot-controller-7577d6f48-kzjmp\" (UID: \"7fa7b789-9201-493e-a96d-484a2622301a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302748 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.302977 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303012 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " 
pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303045 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303085 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rp45l\" (UniqueName: \"kubernetes.io/projected/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-kube-api-access-rp45l\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303123 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303165 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs7nz\" (UniqueName: \"kubernetes.io/projected/f2ca65f5-7dbe-4407-b38e-713592f62136-kube-api-access-fs7nz\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303196 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqwrr\" (UniqueName: 
\"kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303225 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303253 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:58.303358 master-0 kubenswrapper[16352]: I0307 21:17:58.303375 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303407 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " 
pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303433 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgkz\" (UniqueName: \"kubernetes.io/projected/ca25117a-ccd5-4628-8342-e277bb7be0e2-kube-api-access-9kgkz\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303444 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-service-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303461 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-trusted-ca-bundle\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303482 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/fc392945-53ad-473c-8803-70e2026712d2-marketplace-operator-metrics\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303491 16352 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303537 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-metrics-certs\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303560 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2tvr\" (UniqueName: \"kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303580 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303603 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vjbmk\" (UniqueName: \"kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk\") pod \"csi-snapshot-controller-operator-5685fbc7d-txnh5\" (UID: 
\"ab2f6566-730d-46f5-92ed-79e3039d24e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303623 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmp5q\" (UniqueName: \"kubernetes.io/projected/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-kube-api-access-vmp5q\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303713 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24f69689-ff12-4786-af05-61429e9eadf8-config\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303738 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-client\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303774 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cz8d\" (UniqueName: \"kubernetes.io/projected/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-kube-api-access-7cz8d\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303951 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.303981 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.304004 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.304029 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.304053 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp74q\" (UID: 
\"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:58.304062 master-0 kubenswrapper[16352]: I0307 21:17:58.304065 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-serving-cert\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304165 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304206 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-client\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304205 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-config\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304243 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-default-certificate\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304343 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304401 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304450 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304566 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 
21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304593 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfa9d3-fc26-42e9-8bac-ff2c25223654-service-ca\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304640 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304663 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2b28m\" (UniqueName: \"kubernetes.io/projected/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-kube-api-access-2b28m\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304692 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304715 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: 
\"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304730 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-cache\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304745 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-serving-cert\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304744 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304815 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304852 16352 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-node-pullsecrets\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304887 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304928 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.304961 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbz9p\" (UniqueName: \"kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305016 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: 
\"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305047 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305078 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305113 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305139 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-telemetry-config\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f" Mar 07 
21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305146 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgwj6\" (UniqueName: \"kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305179 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305263 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tpztb\" (UniqueName: \"kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305310 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-tuning-operator-tls\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-node-tuning-operator-tls\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305357 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-drnv4\" (UniqueName: \"kubernetes.io/projected/46d1b044-16fb-4442-a554-6b15a8a1c8ae-kube-api-access-drnv4\") pod 
\"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:58.305358 master-0 kubenswrapper[16352]: I0307 21:17:58.305408 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305455 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305467 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-config\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305500 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lbmm\" (UniqueName: \"kubernetes.io/projected/7d462ed3-d191-42a5-b8e0-79ab9af13991-kube-api-access-4lbmm\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305544 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305591 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305665 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca25117a-ccd5-4628-8342-e277bb7be0e2-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305735 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305968 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-metrics-certs\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp" Mar 07 
21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.305949 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-run\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306043 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306074 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/61a9fce6-50e1-413c-9ec0-177d6e903bdd-metrics-tls\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306076 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4h4st\" (UniqueName: \"kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306136 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjt7j\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-kube-api-access-zjt7j\") pod 
\"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306172 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306199 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-sys\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306231 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f748l\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306282 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-serving-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306310 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306342 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306370 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306405 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306439 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-tuned\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306491 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/abfb5602-7255-43d7-a510-e7f94885887e-serving-cert\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306502 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306718 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306752 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpjms\" (UniqueName: \"kubernetes.io/projected/d50f92ea-1c78-4535-a14c-96b00f2cf377-kube-api-access-jpjms\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306791 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " 
pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.306819 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307075 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b88c5fbe-e19f-45b3-ab03-e1626f95776d-config\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307118 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307147 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307193 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307221 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-catalog-content\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307249 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307277 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307305 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-lxzml\" (UID: \"9515e34b-addf-487a-adf8-c6ef24fcc54c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 
21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307414 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/183a5212-1b21-44e4-9ed5-2f63f76e652e-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307451 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jcxp\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-kube-api-access-2jcxp\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307478 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307595 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-service-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307649 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: 
\"kubernetes.io/empty-dir/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-tuned\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307749 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-catalog-content\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307742 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/b269ae2f-44ff-46c7-9039-21fca4a7a790-cni-binary-copy\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307850 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/183a5212-1b21-44e4-9ed5-2f63f76e652e-cache\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307911 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-srv-cert\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307950 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307945 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-baremetal-operator-tls\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cluster-baremetal-operator-tls\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.307974 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqtbf\" (UniqueName: \"kubernetes.io/projected/8512a7f6-889f-483e-960f-1ce3c834e92c-kube-api-access-fqtbf\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308098 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-cert\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308128 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " 
pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308231 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9mmg\" (UniqueName: \"kubernetes.io/projected/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-kube-api-access-d9mmg\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308306 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308473 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308575 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308626 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308710 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308751 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308934 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.308996 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 
21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309031 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/f8c93e0d-54e5-4c80-9d69-a70317baeacf-apiservice-cert\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309038 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309109 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309147 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309166 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/29624e4f-d970-4dfa-a8f1-515b73397c8f-available-featuregates\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309178 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-policies\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309234 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rxkw8\" (UniqueName: \"kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309300 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-utilities\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309337 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:17:58.309811 master-0 
kubenswrapper[16352]: I0307 21:17:58.309504 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69jxd\" (UniqueName: \"kubernetes.io/projected/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-kube-api-access-69jxd\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309536 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-utilities\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309541 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f8mm9\" (UniqueName: \"kubernetes.io/projected/b12701eb-4226-4f9c-9398-ad0c3fea7451-kube-api-access-f8mm9\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309602 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:58.309811 master-0 kubenswrapper[16352]: I0307 21:17:58.309783 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/69851821-e1fc-44a8-98df-0cfe9d564126-srv-cert\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: 
\"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.309901 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.309973 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310012 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-catalog-content\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310058 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310116 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310172 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d50f92ea-1c78-4535-a14c-96b00f2cf377-service-ca-bundle\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310218 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f9rq\" (UniqueName: \"kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310296 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310339 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-serving-cert\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " 
pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310377 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310419 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2369ce94-237f-41ad-9875-173578764483-signing-cabundle\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310465 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310510 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktjs9\" (UniqueName: \"kubernetes.io/projected/bd9cf577-3c49-417b-a6c0-9d307c113221-kube-api-access-ktjs9\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310553 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-lib-modules\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310603 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310655 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310737 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310786 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 
21:17:58.310841 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-var-lib-kubelet\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310886 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310925 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-encryption-config\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.310964 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311040 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-image-registry-operator-tls\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: 
\"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311111 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311180 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311221 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311261 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311299 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311309 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-ca\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311338 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2j2bf\" (UniqueName: \"kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311377 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311411 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f08edf29-c53f-452d-880b-e8ce27b05b6f-catalog-content\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " 
pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311421 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311458 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311500 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wvpvs\" (UniqueName: \"kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311589 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mzlv\" (UniqueName: \"kubernetes.io/projected/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-kube-api-access-9mzlv\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311631 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vvzbm\" 
(UniqueName: \"kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.311992 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2369ce94-237f-41ad-9875-173578764483-signing-cabundle\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.313099 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/5f82d4aa-0cb5-477f-944e-745a21d124fc-etcd-client\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.313139 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5b339e6a-cae6-416a-963b-2fd23cecba96-config\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.313520 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e543d99f-e0dc-49be-95bd-c39eabd05ce8-config\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd" Mar 07 
21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.315754 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.315814 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-stats-auth\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.315859 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.315904 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2ca65f5-7dbe-4407-b38e-713592f62136-hosts-file\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316084 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd633b72-3d0b-4601-a2c2-3f487d943b35-serving-cert\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: 
\"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316178 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29624e4f-d970-4dfa-a8f1-515b73397c8f-serving-cert\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316416 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/47ecf172-666e-4360-97ff-bd9dbccc1fd6-metrics-tls\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316096 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316537 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-utilities\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316574 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-certs\" 
(UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316619 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316726 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-modprobe-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316765 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316784 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24f69689-ff12-4786-af05-61429e9eadf8-serving-cert\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 
21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316878 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-khdpn\" (UniqueName: \"kubernetes.io/projected/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-kube-api-access-khdpn\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.316934 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: E0307 21:17:58.317087 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"etcd-master-0\" already exists" pod="openshift-etcd/etcd-master-0" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: E0307 21:17:58.317296 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-controller-manager-master-0\" already exists" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: E0307 21:17:58.317525 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-startup-monitor-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.317649 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 
21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.317765 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-utilities\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.317871 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-olm-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: E0307 21:17:58.317979 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"bootstrap-kube-scheduler-master-0\" already exists" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.318085 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2w44\" (UniqueName: \"kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.318121 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hx8ck\" (UniqueName: \"kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.325310 master-0 
kubenswrapper[16352]: I0307 21:17:58.318199 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.318359 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-87fml\" (UniqueName: \"kubernetes.io/projected/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-kube-api-access-87fml\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.318445 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.317353 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-images\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.319840 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-olm-operator-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/8269652e-360f-43ef-9e7d-473c5f478275-cluster-olm-operator-serving-cert\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.319972 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-config-volume\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320023 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320114 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320158 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: 
\"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320194 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320235 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320297 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-trusted-ca\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320445 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320492 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320534 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96cfa9d3-fc26-42e9-8bac-ff2c25223654-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320579 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qskh\" (UniqueName: \"kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.320620 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: E0307 21:17:58.321052 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-apiserver-master-0\" already exists" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321279 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-trusted-ca-bundle\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321417 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-kubernetes\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321471 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321511 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wps6\" (UniqueName: \"kubernetes.io/projected/6d5765e6-80cc-404b-b375-c109febd1843-kube-api-access-8wps6\") pod \"network-check-source-7c67b67d47-88mpr\" (UID: \"6d5765e6-80cc-404b-b375-c109febd1843\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321549 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysconfig\") pod \"tuned-qzjmv\" (UID: 
\"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321589 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit-dir\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321632 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpck7\" (UniqueName: \"kubernetes.io/projected/e3fe386a-dea8-484a-b95a-0f3f475b1f82-kube-api-access-fpck7\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321717 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321752 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321798 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zb5zm\" (UniqueName: \"kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.321840 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5n27m\" (UniqueName: \"kubernetes.io/projected/7f69a884-5fe8-4c03-8258-ff35396efc30-kube-api-access-5n27m\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.322805 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ds84\" (UniqueName: \"kubernetes.io/projected/2369ce94-237f-41ad-9875-173578764483-kube-api-access-4ds84\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.322890 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.322952 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: 
\"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.322986 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.323218 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.323253 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.323284 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.323318 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-c76ff\" (UniqueName: \"kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.323220 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.323885 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/abfb5602-7255-43d7-a510-e7f94885887e-config\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.324174 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalogserver-certs\" (UniqueName: \"kubernetes.io/secret/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-catalogserver-certs\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.324319 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"whereabouts-configmap\" (UniqueName: \"kubernetes.io/configmap/3caff2c1-f178-4e16-916d-27ccf178ff37-whereabouts-configmap\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.325310 master-0 kubenswrapper[16352]: I0307 21:17:58.324835 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3faedef9-d507-48aa-82a8-f3dc9b5adeef-serving-cert\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m" Mar 07 21:17:58.335487 master-0 kubenswrapper[16352]: I0307 21:17:58.334366 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-script-lib\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.340013 master-0 kubenswrapper[16352]: I0307 21:17:58.339933 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 07 21:17:58.344768 master-0 kubenswrapper[16352]: I0307 21:17:58.344494 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/666475e5-df4b-44ef-a2d4-39d84ab91aad-iptables-alerter-script\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:17:58.357866 master-0 kubenswrapper[16352]: I0307 21:17:58.357801 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 21:17:58.360763 master-0 kubenswrapper[16352]: I0307 21:17:58.359106 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-ovnkube-config\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.369138 master-0 kubenswrapper[16352]: I0307 21:17:58.369083 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovnkube-config\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:17:58.378966 master-0 kubenswrapper[16352]: I0307 21:17:58.378908 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 07 21:17:58.382570 master-0 kubenswrapper[16352]: I0307 21:17:58.382468 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/420c6d8f-6313-4d6c-b817-420797fc6878-ovn-node-metrics-cert\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.398839 master-0 kubenswrapper[16352]: I0307 21:17:58.398715 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 07 21:17:58.419329 master-0 kubenswrapper[16352]: I0307 21:17:58.419140 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 21:17:58.423350 master-0 kubenswrapper[16352]: I0307 21:17:58.423254 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-serving-cert\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.424477 master-0 kubenswrapper[16352]: I0307 21:17:58.424391 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: 
\"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.424648 master-0 kubenswrapper[16352]: I0307 21:17:58.424617 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.424725 master-0 kubenswrapper[16352]: I0307 21:17:58.424529 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-slash\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.424725 master-0 kubenswrapper[16352]: I0307 21:17:58.424707 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-netns\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.424881 master-0 kubenswrapper[16352]: I0307 21:17:58.424859 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.425087 master-0 kubenswrapper[16352]: I0307 21:17:58.424924 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " 
pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.425087 master-0 kubenswrapper[16352]: I0307 21:17:58.425024 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-os-release\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.425163 master-0 kubenswrapper[16352]: I0307 21:17:58.425128 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.425163 master-0 kubenswrapper[16352]: I0307 21:17:58.425145 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-docker\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.425233 master-0 kubenswrapper[16352]: I0307 21:17:58.425223 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-system-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.425351 master-0 kubenswrapper[16352]: I0307 21:17:58.425299 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-node-pullsecrets\") pod \"apiserver-694d775589-btnh4\" (UID: 
\"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.425499 master-0 kubenswrapper[16352]: I0307 21:17:58.425438 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.425499 master-0 kubenswrapper[16352]: I0307 21:17:58.425449 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-node-pullsecrets\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.425615 master-0 kubenswrapper[16352]: I0307 21:17:58.425561 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.425615 master-0 kubenswrapper[16352]: I0307 21:17:58.425592 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.425747 master-0 kubenswrapper[16352]: I0307 21:17:58.425622 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-conf-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " 
pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.425790 master-0 kubenswrapper[16352]: I0307 21:17:58.425762 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca25117a-ccd5-4628-8342-e277bb7be0e2-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:58.425833 master-0 kubenswrapper[16352]: I0307 21:17:58.425801 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-run\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.425862 master-0 kubenswrapper[16352]: I0307 21:17:58.425850 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.425884 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.425914 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-sys\") pod 
\"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.425960 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.425974 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.426004 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/ca25117a-ccd5-4628-8342-e277bb7be0e2-host-etc-kube\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.426003 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.426060 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-etc-kubernetes\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.426081 master-0 kubenswrapper[16352]: I0307 21:17:58.426063 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-node-log\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426373 master-0 kubenswrapper[16352]: I0307 21:17:58.426085 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/f8980370-267c-4168-ba97-d780698533ff-host-etc-kube\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" Mar 07 21:17:58.426373 master-0 kubenswrapper[16352]: I0307 21:17:58.426098 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.426373 master-0 kubenswrapper[16352]: I0307 21:17:58.426131 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.426373 master-0 kubenswrapper[16352]: I0307 21:17:58.426041 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-sys\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.426373 master-0 kubenswrapper[16352]: I0307 21:17:58.426174 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426373 master-0 kubenswrapper[16352]: I0307 21:17:58.426255 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-bin\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426529 master-0 kubenswrapper[16352]: I0307 21:17:58.426376 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.426529 master-0 kubenswrapper[16352]: I0307 21:17:58.426417 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426529 master-0 kubenswrapper[16352]: I0307 21:17:58.426447 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.426529 master-0 kubenswrapper[16352]: I0307 21:17:58.426470 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426529 master-0 kubenswrapper[16352]: I0307 21:17:58.426505 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.426828 master-0 kubenswrapper[16352]: I0307 21:17:58.426676 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-run\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.426828 master-0 kubenswrapper[16352]: I0307 21:17:58.426717 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426828 master-0 kubenswrapper[16352]: I0307 21:17:58.426737 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-log-socket\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.426828 master-0 kubenswrapper[16352]: I0307 21:17:58.426781 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-multus-certs\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.426828 master-0 kubenswrapper[16352]: I0307 21:17:58.426781 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.427029 master-0 kubenswrapper[16352]: I0307 21:17:58.426973 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.427029 master-0 kubenswrapper[16352]: I0307 21:17:58.426984 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-tuning-conf-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.427110 master-0 kubenswrapper[16352]: I0307 21:17:58.427045 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-lib-modules\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.427110 master-0 kubenswrapper[16352]: I0307 21:17:58.427070 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-os-release\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.427110 master-0 kubenswrapper[16352]: I0307 21:17:58.427077 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.427222 master-0 kubenswrapper[16352]: I0307 21:17:58.427143 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-system-cni-dir\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.427222 master-0 kubenswrapper[16352]: I0307 21:17:58.427155 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-multus\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.427288 master-0 kubenswrapper[16352]: I0307 21:17:58.427236 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-containers\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.427348 master-0 kubenswrapper[16352]: I0307 21:17:58.427324 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-kubelet\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.427394 master-0 kubenswrapper[16352]: I0307 21:17:58.427359 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.427467 master-0 kubenswrapper[16352]: I0307 21:17:58.427444 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.427513 master-0 kubenswrapper[16352]: I0307 21:17:58.427501 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-var-lib-kubelet\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.427591 master-0 kubenswrapper[16352]: I0307 21:17:58.427560 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.427703 master-0 kubenswrapper[16352]: I0307 21:17:58.427657 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.427860 master-0 kubenswrapper[16352]: I0307 21:17:58.427719 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.427860 master-0 kubenswrapper[16352]: I0307 21:17:58.427840 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2ca65f5-7dbe-4407-b38e-713592f62136-hosts-file\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:17:58.427970 master-0 kubenswrapper[16352]: I0307 21:17:58.427877 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-lib-modules\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.427970 master-0 kubenswrapper[16352]: I0307 21:17:58.427909 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-modprobe-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.428054 master-0 kubenswrapper[16352]: I0307 21:17:58.427996 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-var-lib-kubelet\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.428054 master-0 kubenswrapper[16352]: I0307 21:17:58.428017 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-run-netns\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.428054 master-0 kubenswrapper[16352]: I0307 21:17:58.428035 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f2ca65f5-7dbe-4407-b38e-713592f62136-hosts-file\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm" Mar 07 21:17:58.428262 master-0 kubenswrapper[16352]: I0307 21:17:58.428221 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-modprobe-d\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.428437 master-0 kubenswrapper[16352]: I0307 21:17:58.428385 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-kubelet\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.428490 master-0 kubenswrapper[16352]: I0307 21:17:58.428472 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.428771 master-0 kubenswrapper[16352]: I0307 21:17:58.428746 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.428908 master-0 kubenswrapper[16352]: I0307 21:17:58.428888 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.428954 master-0 kubenswrapper[16352]: I0307 21:17:58.428927 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-kubernetes\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.428954 master-0 kubenswrapper[16352]: I0307 21:17:58.428934 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-var-lib-cni-bin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.429102 master-0 kubenswrapper[16352]: I0307 21:17:58.429080 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.429137 master-0 kubenswrapper[16352]: I0307 21:17:58.429111 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429170 master-0 kubenswrapper[16352]: I0307 21:17:58.429147 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-kubernetes\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.429170 master-0 kubenswrapper[16352]: I0307 21:17:58.429157 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysconfig\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.429231 master-0 kubenswrapper[16352]: I0307 21:17:58.429159 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-cni-dir\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.429231 master-0 kubenswrapper[16352]: I0307 21:17:58.429184 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit-dir\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.429295 master-0 kubenswrapper[16352]: I0307 21:17:58.429238 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysconfig\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.429295 master-0 kubenswrapper[16352]: I0307 21:17:58.429210 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-systemd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429295 master-0 kubenswrapper[16352]: I0307 21:17:58.429207 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit-dir\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.429295 master-0 kubenswrapper[16352]: I0307 21:17:58.429185 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: 
\"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-ssl-certs\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.429402 master-0 kubenswrapper[16352]: I0307 21:17:58.429370 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429432 master-0 kubenswrapper[16352]: I0307 21:17:58.429404 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.429432 master-0 kubenswrapper[16352]: I0307 21:17:58.429420 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-systemd-units\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429432 master-0 kubenswrapper[16352]: I0307 21:17:58.429428 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.429516 master-0 kubenswrapper[16352]: I0307 21:17:58.429442 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/96cfa9d3-fc26-42e9-8bac-ff2c25223654-etc-cvo-updatepayloads\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" Mar 07 21:17:58.429516 master-0 kubenswrapper[16352]: I0307 21:17:58.429457 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.429516 master-0 kubenswrapper[16352]: I0307 21:17:58.429490 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-multus-socket-dir-parent\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.429592 master-0 kubenswrapper[16352]: I0307 21:17:58.429507 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429592 master-0 kubenswrapper[16352]: I0307 21:17:58.429537 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429592 master-0 
kubenswrapper[16352]: I0307 21:17:58.429560 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/3caff2c1-f178-4e16-916d-27ccf178ff37-cnibin\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg" Mar 07 21:17:58.429723 master-0 kubenswrapper[16352]: I0307 21:17:58.429600 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429723 master-0 kubenswrapper[16352]: I0307 21:17:58.429603 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-cni-netd\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429723 master-0 kubenswrapper[16352]: I0307 21:17:58.429604 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-etc-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429723 master-0 kubenswrapper[16352]: I0307 21:17:58.429633 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429851 master-0 kubenswrapper[16352]: I0307 21:17:58.429745 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-containers\" (UniqueName: \"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.429851 master-0 kubenswrapper[16352]: I0307 21:17:58.429774 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.429851 master-0 kubenswrapper[16352]: I0307 21:17:58.429815 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429930 master-0 kubenswrapper[16352]: I0307 21:17:58.429860 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.429930 master-0 kubenswrapper[16352]: I0307 21:17:58.429872 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-containers\" (UniqueName: 
\"kubernetes.io/host-path/183a5212-1b21-44e4-9ed5-2f63f76e652e-etc-containers\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:17:58.429930 master-0 kubenswrapper[16352]: I0307 21:17:58.429915 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-rootfs\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:58.430015 master-0 kubenswrapper[16352]: I0307 21:17:58.429914 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-cnibin\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.430015 master-0 kubenswrapper[16352]: I0307 21:17:58.429979 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.430069 master-0 kubenswrapper[16352]: I0307 21:17:58.430019 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-host-run-k8s-cni-cncf-io\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.430069 master-0 kubenswrapper[16352]: I0307 21:17:58.430060 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-var-lib-openvswitch\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.430191 master-0 kubenswrapper[16352]: I0307 21:17:58.430094 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-rootfs\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:58.430191 master-0 kubenswrapper[16352]: I0307 21:17:58.430104 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:17:58.430191 master-0 kubenswrapper[16352]: I0307 21:17:58.430117 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/420c6d8f-6313-4d6c-b817-420797fc6878-run-ovn\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.430191 master-0 kubenswrapper[16352]: I0307 21:17:58.430117 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/666475e5-df4b-44ef-a2d4-39d84ab91aad-host-slash\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9" Mar 07 21:17:58.430191 master-0 kubenswrapper[16352]: I0307 21:17:58.430153 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-systemd\" 
(UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-systemd\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.430360 master-0 kubenswrapper[16352]: I0307 21:17:58.430237 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-conf\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.430360 master-0 kubenswrapper[16352]: I0307 21:17:58.430263 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-host\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.430360 master-0 kubenswrapper[16352]: I0307 21:17:58.430279 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-systemd\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.430360 master-0 kubenswrapper[16352]: I0307 21:17:58.430288 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-docker\" (UniqueName: \"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.430360 master-0 kubenswrapper[16352]: I0307 21:17:58.430363 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" 
(UniqueName: \"kubernetes.io/host-path/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-dir\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.430504 master-0 kubenswrapper[16352]: I0307 21:17:58.430382 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.430504 master-0 kubenswrapper[16352]: I0307 21:17:58.430447 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-etc-sysctl-conf\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.430560 master-0 kubenswrapper[16352]: I0307 21:17:58.430523 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-dir\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx" Mar 07 21:17:58.430560 master-0 kubenswrapper[16352]: I0307 21:17:58.430552 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/b269ae2f-44ff-46c7-9039-21fca4a7a790-hostroot\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq" Mar 07 21:17:58.430618 master-0 kubenswrapper[16352]: I0307 21:17:58.430567 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-docker\" (UniqueName: 
\"kubernetes.io/host-path/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-etc-docker\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:17:58.430618 master-0 kubenswrapper[16352]: I0307 21:17:58.430605 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-host\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv" Mar 07 21:17:58.440189 master-0 kubenswrapper[16352]: I0307 21:17:58.440136 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 21:17:58.443854 master-0 kubenswrapper[16352]: I0307 21:17:58.442938 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.445183 master-0 kubenswrapper[16352]: I0307 21:17:58.444996 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/46548c2c-6a8a-4382-87de-2c7a8442a33c-env-overrides\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" Mar 07 21:17:58.450446 master-0 kubenswrapper[16352]: I0307 21:17:58.450371 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/420c6d8f-6313-4d6c-b817-420797fc6878-env-overrides\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:17:58.456451 master-0 kubenswrapper[16352]: I0307 21:17:58.456353 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:58.461251 master-0 kubenswrapper[16352]: I0307 21:17:58.460943 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 07 21:17:58.492887 master-0 kubenswrapper[16352]: I0307 21:17:58.492700 16352 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 07 21:17:58.502065 master-0 kubenswrapper[16352]: I0307 21:17:58.495329 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 07 21:17:58.502065 master-0 kubenswrapper[16352]: I0307 21:17:58.500919 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-encryption-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4" Mar 07 21:17:58.515078 master-0 kubenswrapper[16352]: I0307 21:17:58.509455 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientMemory" Mar 07 21:17:58.515078 master-0 kubenswrapper[16352]: I0307 21:17:58.509515 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasNoDiskPressure" Mar 07 21:17:58.515078 master-0 kubenswrapper[16352]: I0307 21:17:58.509532 16352 kubelet_node_status.go:724] "Recording event message for node" node="master-0" event="NodeHasSufficientPID" Mar 07 21:17:58.515078 master-0 kubenswrapper[16352]: I0307 21:17:58.510375 16352 kubelet_node_status.go:76] "Attempting to register node" node="master-0" Mar 07 21:17:58.515078 master-0 kubenswrapper[16352]: I0307 21:17:58.511505 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 21:17:58.515078 master-0 
kubenswrapper[16352]: I0307 21:17:58.514286 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-trusted-ca-bundle\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:17:58.523279 master-0 kubenswrapper[16352]: I0307 21:17:58.523234 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 07 21:17:58.528476 master-0 kubenswrapper[16352]: I0307 21:17:58.528438 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/27b149f7-6aff-45f3-b935-e65279f2f9ee-webhook-cert\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:17:58.531823 master-0 kubenswrapper[16352]: I0307 21:17:58.531755 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") pod \"2357c135-5d09-4657-9038-48d25ed55b2d\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") "
Mar 07 21:17:58.531891 master-0 kubenswrapper[16352]: I0307 21:17:58.531818 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2357c135-5d09-4657-9038-48d25ed55b2d" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:17:58.532013 master-0 kubenswrapper[16352]: I0307 21:17:58.531981 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") pod \"2357c135-5d09-4657-9038-48d25ed55b2d\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") "
Mar 07 21:17:58.532648 master-0 kubenswrapper[16352]: I0307 21:17:58.532618 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock" (OuterVolumeSpecName: "var-lock") pod "2357c135-5d09-4657-9038-48d25ed55b2d" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:17:58.533557 master-0 kubenswrapper[16352]: I0307 21:17:58.533524 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 07 21:17:58.533557 master-0 kubenswrapper[16352]: I0307 21:17:58.533554 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2357c135-5d09-4657-9038-48d25ed55b2d-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:17:58.538106 master-0 kubenswrapper[16352]: I0307 21:17:58.538075 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 07 21:17:58.558901 master-0 kubenswrapper[16352]: I0307 21:17:58.557887 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 07 21:17:58.580091 master-0 kubenswrapper[16352]: I0307 21:17:58.579759 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 07 21:17:58.585434 master-0 kubenswrapper[16352]: I0307 21:17:58.585257 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-audit\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:17:58.597484 master-0 kubenswrapper[16352]: I0307 21:17:58.597402 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca"
Mar 07 21:17:58.601130 master-0 kubenswrapper[16352]: I0307 21:17:58.601086 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-image-import-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:17:58.619665 master-0 kubenswrapper[16352]: I0307 21:17:58.619583 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 07 21:17:58.638608 master-0 kubenswrapper[16352]: I0307 21:17:58.638383 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 07 21:17:58.648126 master-0 kubenswrapper[16352]: I0307 21:17:58.648060 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-etcd-serving-ca\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:17:58.657627 master-0 kubenswrapper[16352]: I0307 21:17:58.657540 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 07 21:17:58.664646 master-0 kubenswrapper[16352]: I0307 21:17:58.664557 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:17:58.679543 master-0 kubenswrapper[16352]: I0307 21:17:58.679353 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 07 21:17:58.683505 master-0 kubenswrapper[16352]: I0307 21:17:58.683450 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-config\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:17:58.698590 master-0 kubenswrapper[16352]: I0307 21:17:58.698497 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 07 21:17:58.702628 master-0 kubenswrapper[16352]: I0307 21:17:58.702548 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-env-overrides\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:17:58.719842 master-0 kubenswrapper[16352]: I0307 21:17:58.719733 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 07 21:17:58.738492 master-0 kubenswrapper[16352]: I0307 21:17:58.738410 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 07 21:17:58.739963 master-0 kubenswrapper[16352]: I0307 21:17:58.739915 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/27b149f7-6aff-45f3-b935-e65279f2f9ee-ovnkube-identity-cm\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:17:58.759531 master-0 kubenswrapper[16352]: I0307 21:17:58.759175 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 07 21:17:58.762577 master-0 kubenswrapper[16352]: I0307 21:17:58.762519 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-serving-ca\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:17:58.777859 master-0 kubenswrapper[16352]: I0307 21:17:58.777660 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 07 21:17:58.799762 master-0 kubenswrapper[16352]: I0307 21:17:58.799652 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 07 21:17:58.803226 master-0 kubenswrapper[16352]: I0307 21:17:58.803177 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-etcd-client\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:17:58.817994 master-0 kubenswrapper[16352]: I0307 21:17:58.817913 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 07 21:17:58.825287 master-0 kubenswrapper[16352]: I0307 21:17:58.825218 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-encryption-config\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:17:58.838199 master-0 kubenswrapper[16352]: I0307 21:17:58.838141 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 07 21:17:58.840408 master-0 kubenswrapper[16352]: I0307 21:17:58.840363 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-audit-policies\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:17:58.857557 master-0 kubenswrapper[16352]: I0307 21:17:58.857457 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 07 21:17:58.863783 master-0 kubenswrapper[16352]: I0307 21:17:58.863706 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7d462ed3-d191-42a5-b8e0-79ab9af13991-trusted-ca-bundle\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:17:58.878124 master-0 kubenswrapper[16352]: I0307 21:17:58.878076 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 07 21:17:58.884834 master-0 kubenswrapper[16352]: I0307 21:17:58.884784 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/46548c2c-6a8a-4382-87de-2c7a8442a33c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k"
Mar 07 21:17:58.904804 master-0 kubenswrapper[16352]: I0307 21:17:58.899098 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 07 21:17:58.904804 master-0 kubenswrapper[16352]: I0307 21:17:58.902468 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7d462ed3-d191-42a5-b8e0-79ab9af13991-serving-cert\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:17:58.929515 master-0 kubenswrapper[16352]: I0307 21:17:58.929410 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle"
Mar 07 21:17:58.938044 master-0 kubenswrapper[16352]: I0307 21:17:58.937926 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-ca-certs\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:17:58.938191 master-0 kubenswrapper[16352]: I0307 21:17:58.938147 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 07 21:17:59.015050 master-0 kubenswrapper[16352]: I0307 21:17:58.957999 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 07 21:17:59.015050 master-0 kubenswrapper[16352]: I0307 21:17:58.978931 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 07 21:17:59.015050 master-0 kubenswrapper[16352]: I0307 21:17:58.983900 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-metrics-tls\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f"
Mar 07 21:17:59.015050 master-0 kubenswrapper[16352]: I0307 21:17:58.998670 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 07 21:17:59.015050 master-0 kubenswrapper[16352]: I0307 21:17:59.005429 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-config-volume\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f"
Mar 07 21:17:59.018960 master-0 kubenswrapper[16352]: I0307 21:17:59.018879 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 07 21:17:59.037801 master-0 kubenswrapper[16352]: I0307 21:17:59.037442 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 07 21:17:59.059147 master-0 kubenswrapper[16352]: I0307 21:17:59.059069 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 07 21:17:59.078959 master-0 kubenswrapper[16352]: I0307 21:17:59.078894 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 07 21:17:59.082104 master-0 kubenswrapper[16352]: I0307 21:17:59.082035 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d50f92ea-1c78-4535-a14c-96b00f2cf377-service-ca-bundle\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:17:59.097720 master-0 kubenswrapper[16352]: I0307 21:17:59.097643 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 07 21:17:59.104671 master-0 kubenswrapper[16352]: I0307 21:17:59.104615 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-default-certificate\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:17:59.118048 master-0 kubenswrapper[16352]: I0307 21:17:59.117961 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 07 21:17:59.132029 master-0 kubenswrapper[16352]: I0307 21:17:59.131946 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:17:59.138574 master-0 kubenswrapper[16352]: I0307 21:17:59.138514 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 07 21:17:59.141573 master-0 kubenswrapper[16352]: I0307 21:17:59.141500 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96cfa9d3-fc26-42e9-8bac-ff2c25223654-serving-cert\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4"
Mar 07 21:17:59.159733 master-0 kubenswrapper[16352]: I0307 21:17:59.157782 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 07 21:17:59.165402 master-0 kubenswrapper[16352]: I0307 21:17:59.165357 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/96cfa9d3-fc26-42e9-8bac-ff2c25223654-service-ca\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4"
Mar 07 21:17:59.178883 master-0 kubenswrapper[16352]: I0307 21:17:59.178800 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 07 21:17:59.195663 master-0 kubenswrapper[16352]: I0307 21:17:59.195496 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 07 21:17:59.195852 master-0 kubenswrapper[16352]: I0307 21:17:59.195821 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-stats-auth\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:17:59.197903 master-0 kubenswrapper[16352]: I0307 21:17:59.197868 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Mar 07 21:17:59.216056 master-0 kubenswrapper[16352]: I0307 21:17:59.216005 16352 request.go:700] Waited for 1.013017403s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-controller-manager/secrets?fieldSelector=metadata.name%3Dopenshift-controller-manager-sa-dockercfg-l888p&limit=500&resourceVersion=0
Mar 07 21:17:59.219199 master-0 kubenswrapper[16352]: I0307 21:17:59.219033 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l888p"
Mar 07 21:17:59.237596 master-0 kubenswrapper[16352]: I0307 21:17:59.237521 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 07 21:17:59.245074 master-0 kubenswrapper[16352]: I0307 21:17:59.244958 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:17:59.263807 master-0 kubenswrapper[16352]: I0307 21:17:59.257849 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 07 21:17:59.264225 master-0 kubenswrapper[16352]: I0307 21:17:59.264173 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:17:59.279078 master-0 kubenswrapper[16352]: I0307 21:17:59.279006 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 07 21:17:59.285399 master-0 kubenswrapper[16352]: I0307 21:17:59.285355 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d50f92ea-1c78-4535-a14c-96b00f2cf377-metrics-certs\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:17:59.297904 master-0 kubenswrapper[16352]: E0307 21:17:59.297861 16352 secret.go:189] Couldn't get secret openshift-cluster-machine-approver/machine-approver-tls: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.298073 master-0 kubenswrapper[16352]: E0307 21:17:59.297957 16352 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/client-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.298171 master-0 kubenswrapper[16352]: E0307 21:17:59.297984 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls podName:e3fe386a-dea8-484a-b95a-0f3f475b1f82 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.797957546 +0000 UTC m=+2.868662605 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-approver-tls" (UniqueName: "kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls") pod "machine-approver-754bdc9f9d-bbz7l" (UID: "e3fe386a-dea8-484a-b95a-0f3f475b1f82") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.298171 master-0 kubenswrapper[16352]: E0307 21:17:59.298126 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca podName:7bac1b9e-53bc-46e9-ba12-2eb0f2d09907 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.798101689 +0000 UTC m=+2.868806748 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "client-ca" (UniqueName: "kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca") pod "route-controller-manager-cdf659ffc-4969h" (UID: "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.298171 master-0 kubenswrapper[16352]: I0307 21:17:59.298154 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 07 21:17:59.299384 master-0 kubenswrapper[16352]: E0307 21:17:59.299335 16352 configmap.go:193] Couldn't get configMap openshift-cloud-credential-operator/cco-trusted-ca: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.299487 master-0 kubenswrapper[16352]: E0307 21:17:59.299447 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca podName:85bb04ed-e2d1-496d-8f2c-9555bb3c5d78 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.799413371 +0000 UTC m=+2.870118470 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cco-trusted-ca" (UniqueName: "kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca") pod "cloud-credential-operator-55d85b7b47-7tb74" (UID: "85bb04ed-e2d1-496d-8f2c-9555bb3c5d78") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.300256 master-0 kubenswrapper[16352]: I0307 21:17:59.300181 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:17:59.301282 master-0 kubenswrapper[16352]: E0307 21:17:59.301252 16352 secret.go:189] Couldn't get secret openshift-machine-config-operator/proxy-tls: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.301359 master-0 kubenswrapper[16352]: E0307 21:17:59.301305 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls podName:655b9f0a-cf27-443d-b0ea-3642dcae1ad2 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.801295356 +0000 UTC m=+2.872000415 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls") pod "machine-config-daemon-kp74q" (UID: "655b9f0a-cf27-443d-b0ea-3642dcae1ad2") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.301359 master-0 kubenswrapper[16352]: E0307 21:17:59.301305 16352 secret.go:189] Couldn't get secret openshift-cloud-controller-manager-operator/cloud-controller-manager-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.301469 master-0 kubenswrapper[16352]: E0307 21:17:59.301404 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls podName:ca25117a-ccd5-4628-8342-e277bb7be0e2 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.801381778 +0000 UTC m=+2.872086867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-controller-manager-operator-tls" (UniqueName: "kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls") pod "cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" (UID: "ca25117a-ccd5-4628-8342-e277bb7be0e2") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.302613 master-0 kubenswrapper[16352]: E0307 21:17:59.302534 16352 secret.go:189] Couldn't get secret openshift-cloud-credential-operator/cloud-credential-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.302613 master-0 kubenswrapper[16352]: E0307 21:17:59.302571 16352 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy-cluster-autoscaler-operator: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.302867 master-0 kubenswrapper[16352]: E0307 21:17:59.302630 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert podName:85bb04ed-e2d1-496d-8f2c-9555bb3c5d78 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.802609308 +0000 UTC m=+2.873314397 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cloud-credential-operator-serving-cert" (UniqueName: "kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert") pod "cloud-credential-operator-55d85b7b47-7tb74" (UID: "85bb04ed-e2d1-496d-8f2c-9555bb3c5d78") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.302867 master-0 kubenswrapper[16352]: E0307 21:17:59.302583 16352 secret.go:189] Couldn't get secret openshift-machine-api/machine-api-operator-tls: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.302867 master-0 kubenswrapper[16352]: E0307 21:17:59.302777 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config podName:b12701eb-4226-4f9c-9398-ad0c3fea7451 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.802733731 +0000 UTC m=+2.873438950 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config") pod "cluster-autoscaler-operator-69576476f7-dqvvb" (UID: "b12701eb-4226-4f9c-9398-ad0c3fea7451") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.302867 master-0 kubenswrapper[16352]: E0307 21:17:59.302822 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls podName:46d1b044-16fb-4442-a554-6b15a8a1c8ae nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.802805192 +0000 UTC m=+2.873510501 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "machine-api-operator-tls" (UniqueName: "kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls") pod "machine-api-operator-84bf6db4f9-t8jw4" (UID: "46d1b044-16fb-4442-a554-6b15a8a1c8ae") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.302867 master-0 kubenswrapper[16352]: E0307 21:17:59.302869 16352 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/cloud-controller-manager-images: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.303088 master-0 kubenswrapper[16352]: E0307 21:17:59.302936 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images podName:ca25117a-ccd5-4628-8342-e277bb7be0e2 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.802922405 +0000 UTC m=+2.873627464 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images") pod "cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" (UID: "ca25117a-ccd5-4628-8342-e277bb7be0e2") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.304144 master-0 kubenswrapper[16352]: E0307 21:17:59.304088 16352 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.304144 master-0 kubenswrapper[16352]: E0307 21:17:59.304094 16352 configmap.go:193] Couldn't get configMap openshift-insights/service-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.304144 master-0 kubenswrapper[16352]: E0307 21:17:59.304127 16352 configmap.go:193] Couldn't get configMap openshift-machine-api/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.304325 master-0 kubenswrapper[16352]: E0307 21:17:59.304149 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert podName:c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.804137145 +0000 UTC m=+2.874842424 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-cert" (UniqueName: "kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert") pod "packageserver-f5bf97fcc-w82vx" (UID: "c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.304325 master-0 kubenswrapper[16352]: E0307 21:17:59.304178 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle podName:8512a7f6-889f-483e-960f-1ce3c834e92c nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.804158605 +0000 UTC m=+2.874863694 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle") pod "insights-operator-8f89dfddd-rlx9x" (UID: "8512a7f6-889f-483e-960f-1ce3c834e92c") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.304325 master-0 kubenswrapper[16352]: E0307 21:17:59.304215 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config podName:46d1b044-16fb-4442-a554-6b15a8a1c8ae nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.804199166 +0000 UTC m=+2.874904265 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config") pod "machine-api-operator-84bf6db4f9-t8jw4" (UID: "46d1b044-16fb-4442-a554-6b15a8a1c8ae") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.305315 master-0 kubenswrapper[16352]: E0307 21:17:59.305286 16352 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.305367 master-0 kubenswrapper[16352]: E0307 21:17:59.305335 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config podName:655b9f0a-cf27-443d-b0ea-3642dcae1ad2 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.805325893 +0000 UTC m=+2.876030952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcd-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config") pod "machine-config-daemon-kp74q" (UID: "655b9f0a-cf27-443d-b0ea-3642dcae1ad2") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305456 16352 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305478 16352 configmap.go:193] Couldn't get configMap openshift-insights/trusted-ca-bundle: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305492 16352 secret.go:189] Couldn't get secret openshift-machine-config-operator/machine-config-server-tls: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305478 16352 configmap.go:193] Couldn't get configMap openshift-cloud-controller-manager-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305528 16352 secret.go:189] Couldn't get secret openshift-machine-config-operator/node-bootstrapper-token: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305566 16352 configmap.go:193] Couldn't get configMap openshift-machine-api/machine-api-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305582 16352 secret.go:189] Couldn't get secret openshift-cluster-storage-operator/cluster-storage-operator-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.305628 master-0 kubenswrapper[16352]: E0307 21:17:59.305596 16352 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/machine-config-operator-images: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.305540 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config podName:7f69a884-5fe8-4c03-8258-ff35396efc30 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.805520708 +0000 UTC m=+2.876225797 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config") pod "machine-config-operator-fdb5c78b5-rk7q8" (UID: "7f69a884-5fe8-4c03-8258-ff35396efc30") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.305542 16352 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.305745 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert podName:bd9cf577-3c49-417b-a6c0-9d307c113221 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.805714973 +0000 UTC m=+2.876420082 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-storage-operator-serving-cert" (UniqueName: "kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert") pod "cluster-storage-operator-6fbfc8dc8f-v48jn" (UID: "bd9cf577-3c49-417b-a6c0-9d307c113221") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.305863 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle podName:8512a7f6-889f-483e-960f-1ce3c834e92c nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.805777544 +0000 UTC m=+2.876482893 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle") pod "insights-operator-8f89dfddd-rlx9x" (UID: "8512a7f6-889f-483e-960f-1ce3c834e92c") : failed to sync configmap cache: timed out waiting for the condition
Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.305880 16352 secret.go:189] Couldn't get secret openshift-insights/openshift-insights-serving-cert: failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.305922 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs podName:599c055c-3517-46cb-b584-0050b12a7dea nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.805901967 +0000 UTC m=+2.876607296 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certs" (UniqueName: "kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs") pod "machine-config-server-xskwx" (UID: "599c055c-3517-46cb-b584-0050b12a7dea") : failed to sync secret cache: timed out waiting for the condition
Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.305972 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config podName:ca25117a-ccd5-4628-8342-e277bb7be0e2 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.805954748 +0000 UTC m=+2.876659847 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config") pod "cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" (UID: "ca25117a-ccd5-4628-8342-e277bb7be0e2") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.306014 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images podName:46d1b044-16fb-4442-a554-6b15a8a1c8ae nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.805998009 +0000 UTC m=+2.876703358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images") pod "machine-api-operator-84bf6db4f9-t8jw4" (UID: "46d1b044-16fb-4442-a554-6b15a8a1c8ae") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.306055 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token podName:599c055c-3517-46cb-b584-0050b12a7dea nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.80603809 +0000 UTC m=+2.876743199 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-bootstrap-token" (UniqueName: "kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token") pod "machine-config-server-xskwx" (UID: "599c055c-3517-46cb-b584-0050b12a7dea") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.306084 master-0 kubenswrapper[16352]: E0307 21:17:59.306083 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images podName:7f69a884-5fe8-4c03-8258-ff35396efc30 nodeName:}" failed. 
No retries permitted until 2026-03-07 21:17:59.806070451 +0000 UTC m=+2.876775550 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "images" (UniqueName: "kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images") pod "machine-config-operator-fdb5c78b5-rk7q8" (UID: "7f69a884-5fe8-4c03-8258-ff35396efc30") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.306126 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config podName:e3fe386a-dea8-484a-b95a-0f3f475b1f82 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.806109482 +0000 UTC m=+2.876814581 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "auth-proxy-config" (UniqueName: "kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config") pod "machine-approver-754bdc9f9d-bbz7l" (UID: "e3fe386a-dea8-484a-b95a-0f3f475b1f82") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.306168 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert podName:8512a7f6-889f-483e-960f-1ce3c834e92c nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.806151733 +0000 UTC m=+2.876856842 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert") pod "insights-operator-8f89dfddd-rlx9x" (UID: "8512a7f6-889f-483e-960f-1ce3c834e92c") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.308483 16352 configmap.go:193] Couldn't get configMap openshift-route-controller-manager/config: failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.308558 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config podName:7bac1b9e-53bc-46e9-ba12-2eb0f2d09907 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.808540471 +0000 UTC m=+2.879245570 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config") pod "route-controller-manager-cdf659ffc-4969h" (UID: "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.308595 16352 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.308713 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates podName:9515e34b-addf-487a-adf8-c6ef24fcc54c nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.808668024 +0000 UTC m=+2.879373113 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates") pod "prometheus-operator-admission-webhook-8464df8497-lxzml" (UID: "9515e34b-addf-487a-adf8-c6ef24fcc54c") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.309174 16352 configmap.go:193] Couldn't get configMap openshift-cluster-machine-approver/machine-approver-config: failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.309289 master-0 kubenswrapper[16352]: E0307 21:17:59.309239 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config podName:e3fe386a-dea8-484a-b95a-0f3f475b1f82 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.809227668 +0000 UTC m=+2.879932727 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "config" (UniqueName: "kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config") pod "machine-approver-754bdc9f9d-bbz7l" (UID: "e3fe386a-dea8-484a-b95a-0f3f475b1f82") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.310022 master-0 kubenswrapper[16352]: E0307 21:17:59.309765 16352 configmap.go:193] Couldn't get configMap openshift-controller-manager/openshift-global-ca: failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.310022 master-0 kubenswrapper[16352]: E0307 21:17:59.309848 16352 configmap.go:193] Couldn't get configMap openshift-machine-config-operator/kube-rbac-proxy: failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.310022 master-0 kubenswrapper[16352]: E0307 21:17:59.309921 16352 secret.go:189] Couldn't get secret openshift-route-controller-manager/serving-cert: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.310022 
master-0 kubenswrapper[16352]: E0307 21:17:59.309881 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles podName:6deed9a9-6702-4177-a35d-58ad9930a893 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.809862123 +0000 UTC m=+2.880567402 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-ca-bundles" (UniqueName: "kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles") pod "controller-manager-86d86fcf49-hgbkg" (UID: "6deed9a9-6702-4177-a35d-58ad9930a893") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.310022 master-0 kubenswrapper[16352]: E0307 21:17:59.309978 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config podName:5446df8b-23d4-4bf3-84ac-d8e1d18813af nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.809962095 +0000 UTC m=+2.880667164 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "mcc-auth-proxy-config" (UniqueName: "kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config") pod "machine-config-controller-ff46b7bdf-55p6v" (UID: "5446df8b-23d4-4bf3-84ac-d8e1d18813af") : failed to sync configmap cache: timed out waiting for the condition Mar 07 21:17:59.310022 master-0 kubenswrapper[16352]: E0307 21:17:59.310003 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert podName:7bac1b9e-53bc-46e9-ba12-2eb0f2d09907 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.809990926 +0000 UTC m=+2.880695995 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "serving-cert" (UniqueName: "kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert") pod "route-controller-manager-cdf659ffc-4969h" (UID: "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.312240 master-0 kubenswrapper[16352]: E0307 21:17:59.312195 16352 secret.go:189] Couldn't get secret openshift-machine-api/cluster-autoscaler-operator-cert: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.312304 master-0 kubenswrapper[16352]: E0307 21:17:59.312236 16352 secret.go:189] Couldn't get secret openshift-operator-lifecycle-manager/packageserver-service-cert: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.312304 master-0 kubenswrapper[16352]: E0307 21:17:59.312280 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert podName:b12701eb-4226-4f9c-9398-ad0c3fea7451 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.812262041 +0000 UTC m=+2.882967130 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert") pod "cluster-autoscaler-operator-69576476f7-dqvvb" (UID: "b12701eb-4226-4f9c-9398-ad0c3fea7451") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.312434 master-0 kubenswrapper[16352]: E0307 21:17:59.312318 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert podName:c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.812298682 +0000 UTC m=+2.883003781 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "apiservice-cert" (UniqueName: "kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert") pod "packageserver-f5bf97fcc-w82vx" (UID: "c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.314508 master-0 kubenswrapper[16352]: E0307 21:17:59.314419 16352 secret.go:189] Couldn't get secret openshift-machine-config-operator/mco-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.314508 master-0 kubenswrapper[16352]: E0307 21:17:59.314484 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls podName:7f69a884-5fe8-4c03-8258-ff35396efc30 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.814469403 +0000 UTC m=+2.885174482 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls") pod "machine-config-operator-fdb5c78b5-rk7q8" (UID: "7f69a884-5fe8-4c03-8258-ff35396efc30") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.319140 master-0 kubenswrapper[16352]: E0307 21:17:59.318953 16352 secret.go:189] Couldn't get secret openshift-machine-config-operator/mcc-proxy-tls: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.319140 master-0 kubenswrapper[16352]: E0307 21:17:59.319036 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls podName:5446df8b-23d4-4bf3-84ac-d8e1d18813af nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.819015724 +0000 UTC m=+2.889720983 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "proxy-tls" (UniqueName: "kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls") pod "machine-config-controller-ff46b7bdf-55p6v" (UID: "5446df8b-23d4-4bf3-84ac-d8e1d18813af") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.319140 master-0 kubenswrapper[16352]: E0307 21:17:59.319064 16352 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.319140 master-0 kubenswrapper[16352]: E0307 21:17:59.319112 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls podName:c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021 nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.819098356 +0000 UTC m=+2.889803445 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls") pod "cluster-samples-operator-664cb58b85-fmzk7" (UID: "c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.324736 master-0 kubenswrapper[16352]: E0307 21:17:59.324693 16352 secret.go:189] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.324736 master-0 kubenswrapper[16352]: E0307 21:17:59.324754 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls podName:1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c nodeName:}" failed. No retries permitted until 2026-03-07 21:17:59.824744052 +0000 UTC m=+2.895449111 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-6686554ddc-dgjgz" (UID: "1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c") : failed to sync secret cache: timed out waiting for the condition Mar 07 21:17:59.337577 master-0 kubenswrapper[16352]: I0307 21:17:59.332335 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 21:17:59.341097 master-0 kubenswrapper[16352]: I0307 21:17:59.340824 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 21:17:59.358275 master-0 kubenswrapper[16352]: I0307 21:17:59.358178 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h286h" Mar 07 21:17:59.380162 master-0 kubenswrapper[16352]: I0307 21:17:59.379873 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 21:17:59.399076 master-0 kubenswrapper[16352]: I0307 21:17:59.398917 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 21:17:59.420503 master-0 kubenswrapper[16352]: I0307 21:17:59.420423 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 07 21:17:59.438216 master-0 kubenswrapper[16352]: I0307 21:17:59.437947 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 07 21:17:59.451496 master-0 kubenswrapper[16352]: I0307 21:17:59.451345 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:17:59.457466 master-0 kubenswrapper[16352]: I0307 21:17:59.457417 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 07 21:17:59.479514 master-0 kubenswrapper[16352]: I0307 21:17:59.479448 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls" Mar 07 21:17:59.498531 master-0 kubenswrapper[16352]: I0307 21:17:59.498473 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-v8df8" Mar 07 21:17:59.518487 master-0 kubenswrapper[16352]: I0307 21:17:59.518443 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 07 21:17:59.538829 master-0 kubenswrapper[16352]: I0307 21:17:59.538775 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-df95k" Mar 07 21:17:59.559060 master-0 kubenswrapper[16352]: I0307 21:17:59.559011 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-kd5ps" Mar 07 21:17:59.577714 master-0 kubenswrapper[16352]: I0307 21:17:59.577597 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-lj4zb" Mar 07 21:17:59.598278 master-0 kubenswrapper[16352]: I0307 21:17:59.598199 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2tlv4" Mar 07 21:17:59.634169 master-0 kubenswrapper[16352]: I0307 21:17:59.634049 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert" Mar 07 21:17:59.638960 master-0 
kubenswrapper[16352]: I0307 21:17:59.638889 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 07 21:17:59.659373 master-0 kubenswrapper[16352]: I0307 21:17:59.659266 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-b6gqw" Mar 07 21:17:59.678299 master-0 kubenswrapper[16352]: I0307 21:17:59.678202 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 07 21:17:59.699021 master-0 kubenswrapper[16352]: I0307 21:17:59.698916 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-lbvsg" Mar 07 21:17:59.719317 master-0 kubenswrapper[16352]: I0307 21:17:59.719118 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 07 21:17:59.737234 master-0 kubenswrapper[16352]: I0307 21:17:59.737135 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 07 21:17:59.763284 master-0 kubenswrapper[16352]: I0307 21:17:59.763147 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle" Mar 07 21:17:59.777485 master-0 kubenswrapper[16352]: I0307 21:17:59.777393 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-7fv8q" Mar 07 21:17:59.798588 master-0 kubenswrapper[16352]: I0307 21:17:59.798317 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 07 21:17:59.819866 master-0 kubenswrapper[16352]: I0307 21:17:59.818563 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 07 21:17:59.838026 master-0 
kubenswrapper[16352]: I0307 21:17:59.837945 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 21:17:59.857247 master-0 kubenswrapper[16352]: I0307 21:17:59.857197 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt" Mar 07 21:17:59.872355 master-0 kubenswrapper[16352]: I0307 21:17:59.872265 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:59.872451 master-0 kubenswrapper[16352]: I0307 21:17:59.872402 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:59.872819 master-0 kubenswrapper[16352]: I0307 21:17:59.872769 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:59.872932 master-0 kubenswrapper[16352]: I0307 21:17:59.872895 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-controller-manager-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:59.873087 master-0 kubenswrapper[16352]: I0307 21:17:59.873040 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:59.873147 master-0 kubenswrapper[16352]: I0307 21:17:59.873114 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:59.873357 master-0 kubenswrapper[16352]: I0307 21:17:59.873308 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:59.873401 master-0 kubenswrapper[16352]: I0307 21:17:59.873114 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: 
\"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:59.873401 master-0 kubenswrapper[16352]: I0307 21:17:59.873384 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:59.873496 master-0 kubenswrapper[16352]: I0307 21:17:59.873455 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/46d1b044-16fb-4442-a554-6b15a8a1c8ae-machine-api-operator-tls\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:59.873596 master-0 kubenswrapper[16352]: I0307 21:17:59.873555 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:59.873740 master-0 kubenswrapper[16352]: I0307 21:17:59.873675 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:59.873992 master-0 kubenswrapper[16352]: I0307 
21:17:59.873950 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:59.874085 master-0 kubenswrapper[16352]: I0307 21:17:59.874049 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:59.874321 master-0 kubenswrapper[16352]: I0307 21:17:59.874274 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:59.874366 master-0 kubenswrapper[16352]: I0307 21:17:59.874315 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-service-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:59.874366 master-0 kubenswrapper[16352]: I0307 21:17:59.874350 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp74q\" (UID: 
\"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:17:59.874441 master-0 kubenswrapper[16352]: I0307 21:17:59.874406 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:59.874504 master-0 kubenswrapper[16352]: I0307 21:17:59.874468 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:59.874594 master-0 kubenswrapper[16352]: I0307 21:17:59.874562 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:59.874660 master-0 kubenswrapper[16352]: I0307 21:17:59.874629 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:59.874755 master-0 kubenswrapper[16352]: I0307 21:17:59.874729 16352 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"images\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-images\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:59.874806 master-0 kubenswrapper[16352]: I0307 21:17:59.874733 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:17:59.874850 master-0 kubenswrapper[16352]: I0307 21:17:59.874467 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/46d1b044-16fb-4442-a554-6b15a8a1c8ae-config\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" Mar 07 21:17:59.874850 master-0 kubenswrapper[16352]: I0307 21:17:59.874820 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:17:59.874924 master-0 kubenswrapper[16352]: I0307 21:17:59.874883 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:59.875036 master-0 kubenswrapper[16352]: I0307 21:17:59.874989 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cluster-storage-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/bd9cf577-3c49-417b-a6c0-9d307c113221-cluster-storage-operator-serving-cert\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" Mar 07 21:17:59.875092 master-0 kubenswrapper[16352]: I0307 21:17:59.875020 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:59.875132 master-0 kubenswrapper[16352]: I0307 21:17:59.875086 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/8512a7f6-889f-483e-960f-1ce3c834e92c-trusted-ca-bundle\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:59.875132 master-0 kubenswrapper[16352]: I0307 21:17:59.875119 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:59.875205 master-0 kubenswrapper[16352]: I0307 21:17:59.875167 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:17:59.875349 master-0 kubenswrapper[16352]: I0307 21:17:59.875325 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:59.875439 master-0 kubenswrapper[16352]: I0307 21:17:59.875413 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8512a7f6-889f-483e-960f-1ce3c834e92c-serving-cert\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x" Mar 07 21:17:59.875993 master-0 kubenswrapper[16352]: I0307 21:17:59.875937 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-lxzml\" (UID: \"9515e34b-addf-487a-adf8-c6ef24fcc54c\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:59.875993 master-0 kubenswrapper[16352]: I0307 21:17:59.875972 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/9515e34b-addf-487a-adf8-c6ef24fcc54c-tls-certificates\") pod \"prometheus-operator-admission-webhook-8464df8497-lxzml\" (UID: \"9515e34b-addf-487a-adf8-c6ef24fcc54c\") " 
pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml" Mar 07 21:17:59.876111 master-0 kubenswrapper[16352]: I0307 21:17:59.876079 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:59.876255 master-0 kubenswrapper[16352]: I0307 21:17:59.876177 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:17:59.876481 master-0 kubenswrapper[16352]: I0307 21:17:59.876262 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:59.876659 master-0 kubenswrapper[16352]: I0307 21:17:59.876504 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:59.876753 master-0 kubenswrapper[16352]: I0307 21:17:59.876730 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:17:59.876918 master-0 kubenswrapper[16352]: I0307 21:17:59.876887 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:17:59.876972 master-0 kubenswrapper[16352]: I0307 21:17:59.876947 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:17:59.877040 master-0 kubenswrapper[16352]: I0307 21:17:59.876999 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" Mar 07 21:17:59.877082 master-0 kubenswrapper[16352]: I0307 21:17:59.877014 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " 
pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:17:59.877176 master-0 kubenswrapper[16352]: I0307 21:17:59.877154 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:17:59.877228 master-0 kubenswrapper[16352]: I0307 21:17:59.877211 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 21:17:59.877268 master-0 kubenswrapper[16352]: I0307 21:17:59.877248 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:17:59.877826 master-0 kubenswrapper[16352]: I0307 21:17:59.877799 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" Mar 07 
21:17:59.878533 master-0 kubenswrapper[16352]: I0307 21:17:59.878488 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-5m62w" Mar 07 21:17:59.906140 master-0 kubenswrapper[16352]: I0307 21:17:59.906080 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 07 21:17:59.914170 master-0 kubenswrapper[16352]: I0307 21:17:59.914104 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cco-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cco-trusted-ca\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:59.916995 master-0 kubenswrapper[16352]: I0307 21:17:59.916959 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 07 21:17:59.924505 master-0 kubenswrapper[16352]: I0307 21:17:59.924457 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-credential-operator-serving-cert\" (UniqueName: \"kubernetes.io/secret/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-cloud-credential-operator-serving-cert\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74" Mar 07 21:17:59.937659 master-0 kubenswrapper[16352]: I0307 21:17:59.937590 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 07 21:17:59.957578 master-0 kubenswrapper[16352]: I0307 21:17:59.957523 16352 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-2z9v6" Mar 07 21:17:59.978705 master-0 kubenswrapper[16352]: I0307 21:17:59.978469 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 07 21:17:59.999174 master-0 kubenswrapper[16352]: I0307 21:17:59.999114 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 21:18:00.008417 master-0 kubenswrapper[16352]: I0307 21:18:00.008354 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/7f69a884-5fe8-4c03-8258-ff35396efc30-proxy-tls\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:18:00.019350 master-0 kubenswrapper[16352]: I0307 21:18:00.019269 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 21:18:00.026080 master-0 kubenswrapper[16352]: I0307 21:18:00.026014 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-images\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:18:00.038762 master-0 kubenswrapper[16352]: I0307 21:18:00.038703 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 07 21:18:00.045641 master-0 kubenswrapper[16352]: I0307 21:18:00.045583 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-mcd-auth-proxy-config\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:18:00.055739 master-0 kubenswrapper[16352]: I0307 21:18:00.047057 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/7f69a884-5fe8-4c03-8258-ff35396efc30-auth-proxy-config\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8" Mar 07 21:18:00.055739 master-0 kubenswrapper[16352]: I0307 21:18:00.047256 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/5446df8b-23d4-4bf3-84ac-d8e1d18813af-mcc-auth-proxy-config\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:18:00.058358 master-0 kubenswrapper[16352]: I0307 21:18:00.058296 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 07 21:18:00.079651 master-0 kubenswrapper[16352]: I0307 21:18:00.079567 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-wq5zr" Mar 07 21:18:00.098078 master-0 kubenswrapper[16352]: I0307 21:18:00.098022 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert" Mar 07 21:18:00.108210 master-0 kubenswrapper[16352]: I0307 21:18:00.108142 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/b12701eb-4226-4f9c-9398-ad0c3fea7451-cert\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:18:00.118714 master-0 kubenswrapper[16352]: I0307 21:18:00.118633 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator" Mar 07 21:18:00.124008 master-0 kubenswrapper[16352]: I0307 21:18:00.123962 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/b12701eb-4226-4f9c-9398-ad0c3fea7451-auth-proxy-config\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" Mar 07 21:18:00.138495 master-0 kubenswrapper[16352]: I0307 21:18:00.138420 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 21:18:00.148732 master-0 kubenswrapper[16352]: I0307 21:18:00.148671 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-samples-operator-tls\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7" Mar 07 21:18:00.157676 master-0 kubenswrapper[16352]: I0307 21:18:00.157594 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 07 21:18:00.178011 master-0 kubenswrapper[16352]: I0307 21:18:00.177966 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 07 
21:18:00.198991 master-0 kubenswrapper[16352]: I0307 21:18:00.198895 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 07 21:18:00.204005 master-0 kubenswrapper[16352]: I0307 21:18:00.203942 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-proxy-tls\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q" Mar 07 21:18:00.216597 master-0 kubenswrapper[16352]: I0307 21:18:00.216500 16352 request.go:700] Waited for 1.990239573s due to client-side throttling, not priority and fairness, request: GET:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-daemon-dockercfg-w2xft&limit=500&resourceVersion=0 Mar 07 21:18:00.218609 master-0 kubenswrapper[16352]: I0307 21:18:00.218553 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-w2xft" Mar 07 21:18:00.239283 master-0 kubenswrapper[16352]: I0307 21:18:00.239120 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 07 21:18:00.244185 master-0 kubenswrapper[16352]: I0307 21:18:00.244132 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-webhook-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:18:00.248006 master-0 kubenswrapper[16352]: I0307 21:18:00.247927 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" 
(UniqueName: \"kubernetes.io/secret/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-apiservice-cert\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx" Mar 07 21:18:00.258750 master-0 kubenswrapper[16352]: I0307 21:18:00.258635 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-6xbgq" Mar 07 21:18:00.277626 master-0 kubenswrapper[16352]: I0307 21:18:00.277535 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 07 21:18:00.285204 master-0 kubenswrapper[16352]: I0307 21:18:00.285116 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-auth-proxy-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:18:00.308784 master-0 kubenswrapper[16352]: I0307 21:18:00.306031 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 21:18:00.313353 master-0 kubenswrapper[16352]: I0307 21:18:00.313272 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/e3fe386a-dea8-484a-b95a-0f3f475b1f82-machine-approver-tls\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:18:00.320456 master-0 kubenswrapper[16352]: I0307 21:18:00.320388 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fswfb" Mar 07 
21:18:00.337879 master-0 kubenswrapper[16352]: I0307 21:18:00.337785 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 21:18:00.346008 master-0 kubenswrapper[16352]: I0307 21:18:00.345941 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e3fe386a-dea8-484a-b95a-0f3f475b1f82-config\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" Mar 07 21:18:00.358140 master-0 kubenswrapper[16352]: I0307 21:18:00.358086 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 07 21:18:00.400420 master-0 kubenswrapper[16352]: I0307 21:18:00.399980 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 07 21:18:00.400420 master-0 kubenswrapper[16352]: I0307 21:18:00.400002 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-cdmkh" Mar 07 21:18:00.441192 master-0 kubenswrapper[16352]: I0307 21:18:00.407157 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-images\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:18:00.446162 master-0 kubenswrapper[16352]: I0307 21:18:00.446085 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt" Mar 07 21:18:00.446329 master-0 
kubenswrapper[16352]: I0307 21:18:00.446224 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy" Mar 07 21:18:00.459755 master-0 kubenswrapper[16352]: I0307 21:18:00.456260 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ca25117a-ccd5-4628-8342-e277bb7be0e2-auth-proxy-config\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:18:00.459755 master-0 kubenswrapper[16352]: I0307 21:18:00.457997 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt" Mar 07 21:18:00.478057 master-0 kubenswrapper[16352]: I0307 21:18:00.477962 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lvvbn" Mar 07 21:18:00.497903 master-0 kubenswrapper[16352]: I0307 21:18:00.497784 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 07 21:18:00.518093 master-0 kubenswrapper[16352]: I0307 21:18:00.518039 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-z5sb9" Mar 07 21:18:00.536930 master-0 kubenswrapper[16352]: I0307 21:18:00.536865 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 07 21:18:00.538122 master-0 kubenswrapper[16352]: I0307 21:18:00.538081 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/5446df8b-23d4-4bf3-84ac-d8e1d18813af-proxy-tls\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" Mar 07 21:18:00.558278 master-0 kubenswrapper[16352]: I0307 21:18:00.558208 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 07 21:18:00.564268 master-0 kubenswrapper[16352]: I0307 21:18:00.564217 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cloud-controller-manager-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca25117a-ccd5-4628-8342-e277bb7be0e2-cloud-controller-manager-operator-tls\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" Mar 07 21:18:00.577295 master-0 kubenswrapper[16352]: I0307 21:18:00.577249 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-x6w69" Mar 07 21:18:00.597944 master-0 kubenswrapper[16352]: I0307 21:18:00.597893 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 21:18:00.606409 master-0 kubenswrapper[16352]: I0307 21:18:00.606349 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-node-bootstrap-token\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx" Mar 07 21:18:00.618529 master-0 kubenswrapper[16352]: I0307 21:18:00.618475 16352 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 07 21:18:00.625350 master-0 kubenswrapper[16352]: I0307 21:18:00.625303 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/599c055c-3517-46cb-b584-0050b12a7dea-certs\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx"
Mar 07 21:18:00.755537 master-0 kubenswrapper[16352]: I0307 21:18:00.755405 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzr66\" (UniqueName: \"kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66\") pod \"controller-manager-86d86fcf49-hgbkg\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") " pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:18:00.755537 master-0 kubenswrapper[16352]: I0307 21:18:00.755428 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tq99k\" (UniqueName: \"kubernetes.io/projected/c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d-kube-api-access-tq99k\") pod \"packageserver-f5bf97fcc-w82vx\" (UID: \"c0b2e9b2-e096-4d4d-911f-bc68e2d0ec0d\") " pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx"
Mar 07 21:18:00.759531 master-0 kubenswrapper[16352]: I0307 21:18:00.759490 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwj77\" (UniqueName: \"kubernetes.io/projected/c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9-kube-api-access-pwj77\") pod \"community-operators-rw59s\" (UID: \"c595b10d-f1b2-46bc-8a37-bcdc0ec3dca9\") " pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:18:00.781435 master-0 kubenswrapper[16352]: I0307 21:18:00.781371 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwzgb\" (UniqueName: \"kubernetes.io/projected/15270349-f3aa-43bc-88a8-f0fff3aa2528-kube-api-access-qwzgb\") pod \"network-check-target-fr4qr\" (UID: \"15270349-f3aa-43bc-88a8-f0fff3aa2528\") " pod="openshift-network-diagnostics/network-check-target-fr4qr"
Mar 07 21:18:00.781641 master-0 kubenswrapper[16352]: I0307 21:18:00.781480 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wjtgs\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-kube-api-access-wjtgs\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:18:00.783935 master-0 kubenswrapper[16352]: I0307 21:18:00.783897 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsspm\" (UniqueName: \"kubernetes.io/projected/e543d99f-e0dc-49be-95bd-c39eabd05ce8-kube-api-access-dsspm\") pod \"kube-storage-version-migrator-operator-7f65c457f5-bczvd\" (UID: \"e543d99f-e0dc-49be-95bd-c39eabd05ce8\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-7f65c457f5-bczvd"
Mar 07 21:18:01.237502 master-0 kubenswrapper[16352]: I0307 21:18:01.236405 16352 request.go:700] Waited for 2.931245421s due to client-side throttling, not priority and fairness, request: POST:https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-monitoring/serviceaccounts/cluster-monitoring-operator/token
Mar 07 21:18:01.472797 master-0 kubenswrapper[16352]: I0307 21:18:01.472651 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65pgv\" (UniqueName: \"kubernetes.io/projected/e720291b-0f96-4ebb-80f2-5df7cb194ffc-kube-api-access-65pgv\") pod \"package-server-manager-854648ff6d-kr9ft\" (UID: \"e720291b-0f96-4ebb-80f2-5df7cb194ffc\") " pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:18:01.490821 master-0 kubenswrapper[16352]: E0307 21:18:01.490650 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="pods \"kube-rbac-proxy-crio-master-0\" already exists" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-master-0"
Mar 07 21:18:01.498445 master-0 kubenswrapper[16352]: I0307 21:18:01.498385 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bqlq\" (UniqueName: \"kubernetes.io/projected/599c055c-3517-46cb-b584-0050b12a7dea-kube-api-access-6bqlq\") pod \"machine-config-server-xskwx\" (UID: \"599c055c-3517-46cb-b584-0050b12a7dea\") " pod="openshift-machine-config-operator/machine-config-server-xskwx"
Mar 07 21:18:01.500956 master-0 kubenswrapper[16352]: I0307 21:18:01.500906 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kjhvg\" (UniqueName: \"kubernetes.io/projected/f8980370-267c-4168-ba97-d780698533ff-kube-api-access-kjhvg\") pod \"network-operator-7c649bf6d4-v4xm9\" (UID: \"f8980370-267c-4168-ba97-d780698533ff\") " pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9"
Mar 07 21:18:01.507078 master-0 kubenswrapper[16352]: I0307 21:18:01.507015 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgwj6\" (UniqueName: \"kubernetes.io/projected/dd310b71-6c79-4169-8b8a-7b3fe35a97fd-kube-api-access-dgwj6\") pod \"network-metrics-daemon-l2bdp\" (UID: \"dd310b71-6c79-4169-8b8a-7b3fe35a97fd\") " pod="openshift-multus/network-metrics-daemon-l2bdp"
Mar 07 21:18:01.507248 master-0 kubenswrapper[16352]: I0307 21:18:01.507141 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3faedef9-d507-48aa-82a8-f3dc9b5adeef-kube-api-access\") pod \"openshift-kube-scheduler-operator-5c74bfc494-85z7m\" (UID: \"3faedef9-d507-48aa-82a8-f3dc9b5adeef\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5c74bfc494-85z7m"
Mar 07 21:18:01.508192 master-0 kubenswrapper[16352]: I0307 21:18:01.508133 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tpztb\" (UniqueName: \"kubernetes.io/projected/420c6d8f-6313-4d6c-b817-420797fc6878-kube-api-access-tpztb\") pod \"ovnkube-node-x9v76\" (UID: \"420c6d8f-6313-4d6c-b817-420797fc6878\") " pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:18:01.509103 master-0 kubenswrapper[16352]: I0307 21:18:01.509050 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjt7j\" (UniqueName: \"kubernetes.io/projected/290f6cf4-daa1-4cae-8e91-2411bf81f8b4-kube-api-access-zjt7j\") pod \"catalogd-controller-manager-7f8b8b6f4c-mc2rc\" (UID: \"290f6cf4-daa1-4cae-8e91-2411bf81f8b4\") " pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"
Mar 07 21:18:01.511329 master-0 kubenswrapper[16352]: I0307 21:18:01.511278 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgkz\" (UniqueName: \"kubernetes.io/projected/ca25117a-ccd5-4628-8342-e277bb7be0e2-kube-api-access-9kgkz\") pod \"cluster-cloud-controller-manager-operator-7c8df9b496-wp42j\" (UID: \"ca25117a-ccd5-4628-8342-e277bb7be0e2\") " pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j"
Mar 07 21:18:01.512507 master-0 kubenswrapper[16352]: I0307 21:18:01.512458 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2b28m\" (UniqueName: \"kubernetes.io/projected/7f65054f-caf3-4cd3-889e-8d5a5376b1b8-kube-api-access-2b28m\") pod \"redhat-marketplace-z2cc9\" (UID: \"7f65054f-caf3-4cd3-889e-8d5a5376b1b8\") " pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:18:01.514992 master-0 kubenswrapper[16352]: I0307 21:18:01.514946 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbz9p\" (UniqueName: \"kubernetes.io/projected/a9d64cd1-bd5b-4fbc-972b-000a03c854fe-kube-api-access-zbz9p\") pod \"cluster-monitoring-operator-674cbfbd9d-czm5f\" (UID: \"a9d64cd1-bd5b-4fbc-972b-000a03c854fe\") " pod="openshift-monitoring/cluster-monitoring-operator-674cbfbd9d-czm5f"
Mar 07 21:18:01.516712 master-0 kubenswrapper[16352]: I0307 21:18:01.516630 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpjms\" (UniqueName: \"kubernetes.io/projected/d50f92ea-1c78-4535-a14c-96b00f2cf377-kube-api-access-jpjms\") pod \"router-default-79f8cd6fdd-858hg\" (UID: \"d50f92ea-1c78-4535-a14c-96b00f2cf377\") " pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:18:02.027330 master-0 kubenswrapper[16352]: I0307 21:18:02.027257 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jcxp\" (UniqueName: \"kubernetes.io/projected/183a5212-1b21-44e4-9ed5-2f63f76e652e-kube-api-access-2jcxp\") pod \"operator-controller-controller-manager-6598bfb6c4-mlxbw\" (UID: \"183a5212-1b21-44e4-9ed5-2f63f76e652e\") " pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:18:03.485299 master-0 kubenswrapper[16352]: I0307 21:18:03.485221 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w94dz\" (UniqueName: \"kubernetes.io/projected/666475e5-df4b-44ef-a2d4-39d84ab91aad-kube-api-access-w94dz\") pod \"iptables-alerter-n8nz9\" (UID: \"666475e5-df4b-44ef-a2d4-39d84ab91aad\") " pod="openshift-network-operator/iptables-alerter-n8nz9"
Mar 07 21:18:03.486088 master-0 kubenswrapper[16352]: I0307 21:18:03.485732 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/abfb5602-7255-43d7-a510-e7f94885887e-kube-api-access\") pod \"kube-controller-manager-operator-86d7cdfdfb-wb26b\" (UID: \"abfb5602-7255-43d7-a510-e7f94885887e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b"
Mar 07 21:18:03.488550 master-0 kubenswrapper[16352]: I0307 21:18:03.488493 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hqxlr\" (UniqueName: \"kubernetes.io/projected/f08edf29-c53f-452d-880b-e8ce27b05b6f-kube-api-access-hqxlr\") pod \"certified-operators-vxpb5\" (UID: \"f08edf29-c53f-452d-880b-e8ce27b05b6f\") " pod="openshift-marketplace/certified-operators-vxpb5"
Mar 07 21:18:03.489122 master-0 kubenswrapper[16352]: I0307 21:18:03.489083 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp45l\" (UniqueName: \"kubernetes.io/projected/1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c-kube-api-access-rp45l\") pod \"control-plane-machine-set-operator-6686554ddc-dgjgz\" (UID: \"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c\") " pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz"
Mar 07 21:18:03.503235 master-0 kubenswrapper[16352]: I0307 21:18:03.503185 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4h4st\" (UniqueName: \"kubernetes.io/projected/46548c2c-6a8a-4382-87de-2c7a8442a33c-kube-api-access-4h4st\") pod \"ovnkube-control-plane-66b55d57d-mc46k\" (UID: \"46548c2c-6a8a-4382-87de-2c7a8442a33c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k"
Mar 07 21:18:03.504016 master-0 kubenswrapper[16352]: I0307 21:18:03.503976 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnnlw\" (UniqueName: \"kubernetes.io/projected/ff7c5ff2-49d2-4a55-96d1-5244ae8ad602-kube-api-access-gnnlw\") pod \"authentication-operator-7c6989d6c4-7w8wf\" (UID: \"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602\") " pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf"
Mar 07 21:18:03.504604 master-0 kubenswrapper[16352]: I0307 21:18:03.504547 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5b339e6a-cae6-416a-963b-2fd23cecba96-kube-api-access\") pod \"kube-apiserver-operator-68bd585b-qnhrz\" (UID: \"5b339e6a-cae6-416a-963b-2fd23cecba96\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz"
Mar 07 21:18:03.505059 master-0 kubenswrapper[16352]: I0307 21:18:03.504984 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wps6\" (UniqueName: \"kubernetes.io/projected/6d5765e6-80cc-404b-b375-c109febd1843-kube-api-access-8wps6\") pod \"network-check-source-7c67b67d47-88mpr\" (UID: \"6d5765e6-80cc-404b-b375-c109febd1843\") " pod="openshift-network-diagnostics/network-check-source-7c67b67d47-88mpr"
Mar 07 21:18:03.505173 master-0 kubenswrapper[16352]: I0307 21:18:03.505118 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2tvr\" (UniqueName: \"kubernetes.io/projected/bd633b72-3d0b-4601-a2c2-3f487d943b35-kube-api-access-p2tvr\") pod \"openshift-controller-manager-operator-8565d84698-98wdp\" (UID: \"bd633b72-3d0b-4601-a2c2-3f487d943b35\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-8565d84698-98wdp"
Mar 07 21:18:03.507432 master-0 kubenswrapper[16352]: I0307 21:18:03.507391 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs7nz\" (UniqueName: \"kubernetes.io/projected/f2ca65f5-7dbe-4407-b38e-713592f62136-kube-api-access-fs7nz\") pod \"node-resolver-zhkfm\" (UID: \"f2ca65f5-7dbe-4407-b38e-713592f62136\") " pod="openshift-dns/node-resolver-zhkfm"
Mar 07 21:18:03.508005 master-0 kubenswrapper[16352]: I0307 21:18:03.507969 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cz8d\" (UniqueName: \"kubernetes.io/projected/655b9f0a-cf27-443d-b0ea-3642dcae1ad2-kube-api-access-7cz8d\") pod \"machine-config-daemon-kp74q\" (UID: \"655b9f0a-cf27-443d-b0ea-3642dcae1ad2\") " pod="openshift-machine-config-operator/machine-config-daemon-kp74q"
Mar 07 21:18:03.508280 master-0 kubenswrapper[16352]: I0307 21:18:03.508204 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktjs9\" (UniqueName: \"kubernetes.io/projected/bd9cf577-3c49-417b-a6c0-9d307c113221-kube-api-access-ktjs9\") pod \"cluster-storage-operator-6fbfc8dc8f-v48jn\" (UID: \"bd9cf577-3c49-417b-a6c0-9d307c113221\") " pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn"
Mar 07 21:18:03.508855 master-0 kubenswrapper[16352]: I0307 21:18:03.508817 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f748l\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-kube-api-access-f748l\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:18:03.509327 master-0 kubenswrapper[16352]: I0307 21:18:03.509276 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2-bound-sa-token\") pod \"cluster-image-registry-operator-86d6d77c7c-kg26q\" (UID: \"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2\") " pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q"
Mar 07 21:18:03.515405 master-0 kubenswrapper[16352]: E0307 21:18:03.515369 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:03.515405 master-0 kubenswrapper[16352]: E0307 21:18:03.515399 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:03.515589 master-0 kubenswrapper[16352]: E0307 21:18:03.515467 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:18:04.015446499 +0000 UTC m=+7.086151558 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:03.519154 master-0 kubenswrapper[16352]: I0307 21:18:03.519109 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqtbf\" (UniqueName: \"kubernetes.io/projected/8512a7f6-889f-483e-960f-1ce3c834e92c-kube-api-access-fqtbf\") pod \"insights-operator-8f89dfddd-rlx9x\" (UID: \"8512a7f6-889f-483e-960f-1ce3c834e92c\") " pod="openshift-insights/insights-operator-8f89dfddd-rlx9x"
Mar 07 21:18:03.519483 master-0 kubenswrapper[16352]: I0307 21:18:03.519443 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpck7\" (UniqueName: \"kubernetes.io/projected/e3fe386a-dea8-484a-b95a-0f3f475b1f82-kube-api-access-fpck7\") pod \"machine-approver-754bdc9f9d-bbz7l\" (UID: \"e3fe386a-dea8-484a-b95a-0f3f475b1f82\") " pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l"
Mar 07 21:18:03.520211 master-0 kubenswrapper[16352]: I0307 21:18:03.520172 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/47ecf172-666e-4360-97ff-bd9dbccc1fd6-bound-sa-token\") pod \"ingress-operator-677db989d6-tklw9\" (UID: \"47ecf172-666e-4360-97ff-bd9dbccc1fd6\") " pod="openshift-ingress-operator/ingress-operator-677db989d6-tklw9"
Mar 07 21:18:03.520351 master-0 kubenswrapper[16352]: I0307 21:18:03.520321 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-drnv4\" (UniqueName: \"kubernetes.io/projected/46d1b044-16fb-4442-a554-6b15a8a1c8ae-kube-api-access-drnv4\") pod \"machine-api-operator-84bf6db4f9-t8jw4\" (UID: \"46d1b044-16fb-4442-a554-6b15a8a1c8ae\") " pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4"
Mar 07 21:18:03.522136 master-0 kubenswrapper[16352]: I0307 21:18:03.522107 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nnk5\" (UniqueName: \"kubernetes.io/projected/7fa7b789-9201-493e-a96d-484a2622301a-kube-api-access-5nnk5\") pod \"csi-snapshot-controller-7577d6f48-kzjmp\" (UID: \"7fa7b789-9201-493e-a96d-484a2622301a\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp"
Mar 07 21:18:03.522221 master-0 kubenswrapper[16352]: I0307 21:18:03.522186 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbggb\" (UniqueName: \"kubernetes.io/projected/61a9fce6-50e1-413c-9ec0-177d6e903bdd-kube-api-access-jbggb\") pod \"dns-operator-589895fbb7-wqqqr\" (UID: \"61a9fce6-50e1-413c-9ec0-177d6e903bdd\") " pod="openshift-dns-operator/dns-operator-589895fbb7-wqqqr"
Mar 07 21:18:03.522765 master-0 kubenswrapper[16352]: I0307 21:18:03.522719 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lng9v\" (UniqueName: \"kubernetes.io/projected/69851821-e1fc-44a8-98df-0cfe9d564126-kube-api-access-lng9v\") pod \"olm-operator-d64cfc9db-qd6xh\" (UID: \"69851821-e1fc-44a8-98df-0cfe9d564126\") " pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:18:03.522833 master-0 kubenswrapper[16352]: I0307 21:18:03.522736 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lbmm\" (UniqueName: \"kubernetes.io/projected/7d462ed3-d191-42a5-b8e0-79ab9af13991-kube-api-access-4lbmm\") pod \"apiserver-67cf6dffcb-4z6hx\" (UID: \"7d462ed3-d191-42a5-b8e0-79ab9af13991\") " pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:18:03.523045 master-0 kubenswrapper[16352]: I0307 21:18:03.523012 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"multus-admission-controller-8d675b596-mmqbs\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs"
Mar 07 21:18:03.523446 master-0 kubenswrapper[16352]: I0307 21:18:03.523421 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96cfa9d3-fc26-42e9-8bac-ff2c25223654-kube-api-access\") pod \"cluster-version-operator-8c9c967c7-s44f4\" (UID: \"96cfa9d3-fc26-42e9-8bac-ff2c25223654\") " pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4"
Mar 07 21:18:03.525529 master-0 kubenswrapper[16352]: I0307 21:18:03.525499 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qskh\" (UniqueName: \"kubernetes.io/projected/5f82d4aa-0cb5-477f-944e-745a21d124fc-kube-api-access-6qskh\") pod \"etcd-operator-5884b9cd56-lc94h\" (UID: \"5f82d4aa-0cb5-477f-944e-745a21d124fc\") " pod="openshift-etcd-operator/etcd-operator-5884b9cd56-lc94h"
Mar 07 21:18:03.526405 master-0 kubenswrapper[16352]: I0307 21:18:03.526378 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zb5zm\" (UniqueName: \"kubernetes.io/projected/24f69689-ff12-4786-af05-61429e9eadf8-kube-api-access-zb5zm\") pod \"service-ca-operator-69b6fc6b88-cg9rz\" (UID: \"24f69689-ff12-4786-af05-61429e9eadf8\") " pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz"
Mar 07 21:18:03.526847 master-0 kubenswrapper[16352]: I0307 21:18:03.526817 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mzlv\" (UniqueName: \"kubernetes.io/projected/c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021-kube-api-access-9mzlv\") pod \"cluster-samples-operator-664cb58b85-fmzk7\" (UID: \"c5a5e5cc-bbbf-480c-8ee5-aa0a031d8021\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-664cb58b85-fmzk7"
Mar 07 21:18:03.528871 master-0 kubenswrapper[16352]: I0307 21:18:03.528838 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmp5q\" (UniqueName: \"kubernetes.io/projected/4e94f64e-4a89-4d9d-acbd-80f86bf2f964-kube-api-access-vmp5q\") pod \"dns-default-hm77f\" (UID: \"4e94f64e-4a89-4d9d-acbd-80f86bf2f964\") " pod="openshift-dns/dns-default-hm77f"
Mar 07 21:18:03.528871 master-0 kubenswrapper[16352]: I0307 21:18:03.528865 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rxkw8\" (UniqueName: \"kubernetes.io/projected/a61a736a-66e5-4ca1-a8a7-088cf73cfcce-kube-api-access-rxkw8\") pod \"cluster-baremetal-operator-5cdb4c5598-nmwjr\" (UID: \"a61a736a-66e5-4ca1-a8a7-088cf73cfcce\") " pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr"
Mar 07 21:18:03.530320 master-0 kubenswrapper[16352]: I0307 21:18:03.530288 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zppz\" (UniqueName: \"kubernetes.io/projected/e38fc940-e59a-45ff-978b-fdcdc534a2a5-kube-api-access-2zppz\") pod \"migrator-57ccdf9b5-5l6h9\" (UID: \"e38fc940-e59a-45ff-978b-fdcdc534a2a5\") " pod="openshift-kube-storage-version-migrator/migrator-57ccdf9b5-5l6h9"
Mar 07 21:18:03.540232 master-0 kubenswrapper[16352]: I0307 21:18:03.540200 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2w44\" (UniqueName: \"kubernetes.io/projected/29624e4f-d970-4dfa-a8f1-515b73397c8f-kube-api-access-l2w44\") pod \"openshift-config-operator-64488f9d78-cb227\" (UID: \"29624e4f-d970-4dfa-a8f1-515b73397c8f\") " pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:18:03.540725 master-0 kubenswrapper[16352]: I0307 21:18:03.540697 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2j2bf\" (UniqueName: \"kubernetes.io/projected/3caff2c1-f178-4e16-916d-27ccf178ff37-kube-api-access-2j2bf\") pod \"multus-additional-cni-plugins-xf7kg\" (UID: \"3caff2c1-f178-4e16-916d-27ccf178ff37\") " pod="openshift-multus/multus-additional-cni-plugins-xf7kg"
Mar 07 21:18:03.541543 master-0 kubenswrapper[16352]: I0307 21:18:03.541506 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5n27m\" (UniqueName: \"kubernetes.io/projected/7f69a884-5fe8-4c03-8258-ff35396efc30-kube-api-access-5n27m\") pod \"machine-config-operator-fdb5c78b5-rk7q8\" (UID: \"7f69a884-5fe8-4c03-8258-ff35396efc30\") " pod="openshift-machine-config-operator/machine-config-operator-fdb5c78b5-rk7q8"
Mar 07 21:18:03.541648 master-0 kubenswrapper[16352]: I0307 21:18:03.541623 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjbmk\" (UniqueName: \"kubernetes.io/projected/ab2f6566-730d-46f5-92ed-79e3039d24e8-kube-api-access-vjbmk\") pod \"csi-snapshot-controller-operator-5685fbc7d-txnh5\" (UID: \"ab2f6566-730d-46f5-92ed-79e3039d24e8\") " pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5"
Mar 07 21:18:03.542318 master-0 kubenswrapper[16352]: I0307 21:18:03.542285 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-khdpn\" (UniqueName: \"kubernetes.io/projected/5625eb9f-c80b-47b1-b70c-aa636fbc03ac-kube-api-access-khdpn\") pod \"redhat-operators-fdltd\" (UID: \"5625eb9f-c80b-47b1-b70c-aa636fbc03ac\") " pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:18:03.542727 master-0 kubenswrapper[16352]: I0307 21:18:03.542702 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69jxd\" (UniqueName: \"kubernetes.io/projected/e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9-kube-api-access-69jxd\") pod \"apiserver-694d775589-btnh4\" (UID: \"e8f9c2bb-0b0e-48e0-8728-f2a460dc69e9\") " pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:18:03.543490 master-0 kubenswrapper[16352]: I0307 21:18:03.543458 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f9rq\" (UniqueName: \"kubernetes.io/projected/7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149-kube-api-access-6f9rq\") pod \"catalog-operator-7d9c49f57b-j454x\" (UID: \"7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149\") " pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:18:03.543975 master-0 kubenswrapper[16352]: I0307 21:18:03.543944 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f8mm9\" (UniqueName: \"kubernetes.io/projected/b12701eb-4226-4f9c-9398-ad0c3fea7451-kube-api-access-f8mm9\") pod \"cluster-autoscaler-operator-69576476f7-dqvvb\" (UID: \"b12701eb-4226-4f9c-9398-ad0c3fea7451\") " pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb"
Mar 07 21:18:03.545343 master-0 kubenswrapper[16352]: I0307 21:18:03.545316 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hx8ck\" (UniqueName: \"kubernetes.io/projected/b269ae2f-44ff-46c7-9039-21fca4a7a790-kube-api-access-hx8ck\") pod \"multus-g6nmq\" (UID: \"b269ae2f-44ff-46c7-9039-21fca4a7a790\") " pod="openshift-multus/multus-g6nmq"
Mar 07 21:18:03.546185 master-0 kubenswrapper[16352]: I0307 21:18:03.546143 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqwrr\" (UniqueName: \"kubernetes.io/projected/b88c5fbe-e19f-45b3-ab03-e1626f95776d-kube-api-access-kqwrr\") pod \"openshift-apiserver-operator-799b6db4d7-jtbd6\" (UID: \"b88c5fbe-e19f-45b3-ab03-e1626f95776d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-799b6db4d7-jtbd6"
Mar 07 21:18:03.546388 master-0 kubenswrapper[16352]: I0307 21:18:03.546356 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f72ps\" (UniqueName: \"kubernetes.io/projected/27b149f7-6aff-45f3-b935-e65279f2f9ee-kube-api-access-f72ps\") pod \"network-node-identity-kpsm4\" (UID: \"27b149f7-6aff-45f3-b935-e65279f2f9ee\") " pod="openshift-network-node-identity/network-node-identity-kpsm4"
Mar 07 21:18:03.547021 master-0 kubenswrapper[16352]: I0307 21:18:03.546999 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vvzbm\" (UniqueName: \"kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm\") pod \"route-controller-manager-cdf659ffc-4969h\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") " pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:18:03.553271 master-0 kubenswrapper[16352]: I0307 21:18:03.553227 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ds84\" (UniqueName: \"kubernetes.io/projected/2369ce94-237f-41ad-9875-173578764483-kube-api-access-4ds84\") pod \"service-ca-84bfdbbb7f-h76wh\" (UID: \"2369ce94-237f-41ad-9875-173578764483\") " pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh"
Mar 07 21:18:03.553557 master-0 kubenswrapper[16352]: I0307 21:18:03.553524 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9mmg\" (UniqueName: \"kubernetes.io/projected/85bb04ed-e2d1-496d-8f2c-9555bb3c5d78-kube-api-access-d9mmg\") pod \"cloud-credential-operator-55d85b7b47-7tb74\" (UID: \"85bb04ed-e2d1-496d-8f2c-9555bb3c5d78\") " pod="openshift-cloud-credential-operator/cloud-credential-operator-55d85b7b47-7tb74"
Mar 07 21:18:03.559932 master-0 kubenswrapper[16352]: I0307 21:18:03.559885 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2gv7\" (UniqueName: \"kubernetes.io/projected/5446df8b-23d4-4bf3-84ac-d8e1d18813af-kube-api-access-k2gv7\") pod \"machine-config-controller-ff46b7bdf-55p6v\" (UID: \"5446df8b-23d4-4bf3-84ac-d8e1d18813af\") " pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v"
Mar 07 21:18:03.560405 master-0 kubenswrapper[16352]: I0307 21:18:03.560275 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-87fml\" (UniqueName: \"kubernetes.io/projected/bbc6fdd7-cbf1-416d-a986-bbd6ba259c05-kube-api-access-87fml\") pod \"tuned-qzjmv\" (UID: \"bbc6fdd7-cbf1-416d-a986-bbd6ba259c05\") " pod="openshift-cluster-node-tuning-operator/tuned-qzjmv"
Mar 07 21:18:03.561117 master-0 kubenswrapper[16352]: I0307 21:18:03.561080 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wvpvs\" (UniqueName: \"kubernetes.io/projected/8269652e-360f-43ef-9e7d-473c5f478275-kube-api-access-wvpvs\") pod \"cluster-olm-operator-77899cf6d-cgdkk\" (UID: \"8269652e-360f-43ef-9e7d-473c5f478275\") " pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk"
Mar 07 21:18:03.566955 master-0 kubenswrapper[16352]: I0307 21:18:03.566898 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t24zr\" (UniqueName: \"kubernetes.io/projected/f8c93e0d-54e5-4c80-9d69-a70317baeacf-kube-api-access-t24zr\") pod \"cluster-node-tuning-operator-66c7586884-sxqnh\" (UID: \"f8c93e0d-54e5-4c80-9d69-a70317baeacf\") " pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh"
Mar 07 21:18:03.567544 master-0 kubenswrapper[16352]: I0307 21:18:03.567499 16352 kubelet_node_status.go:115] "Node was previously registered" node="master-0"
Mar 07 21:18:03.567622 master-0 kubenswrapper[16352]: I0307 21:18:03.567613 16352 kubelet_node_status.go:79] "Successfully registered node" node="master-0"
Mar 07 21:18:03.574316 master-0 kubenswrapper[16352]: I0307 21:18:03.574262 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c76ff\" (UniqueName: \"kubernetes.io/projected/fc392945-53ad-473c-8803-70e2026712d2-kube-api-access-c76ff\") pod \"marketplace-operator-64bf9778cb-q7hrg\" (UID: \"fc392945-53ad-473c-8803-70e2026712d2\") " pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:18:03.586470 master-0 kubenswrapper[16352]: E0307 21:18:03.586413 16352 kubelet.go:2526] "Housekeeping took longer than expected" err="housekeeping took too long" expected="1s" actual="4.398s"
Mar 07 21:18:03.594663 master-0 kubenswrapper[16352]: I0307 21:18:03.594594 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" podUID=""
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610243 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610285 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610304 16352 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="dccc88d9-f88f-4d19-bad9-7cccf7e5a543"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610453 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610466 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610479 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610492 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/bootstrap-kube-apiserver-master-0"]
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610501 16352 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/bootstrap-kube-apiserver-master-0" mirrorPodUID="dccc88d9-f88f-4d19-bad9-7cccf7e5a543"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610558 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610641 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610771 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.610938 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.611058 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.611556 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-f5bf97fcc-w82vx"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.611794 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.611830 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-64488f9d78-cb227"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.611953 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613085 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613126 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613163 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613190 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613242 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613295 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613336 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613368 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-hm77f"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613399 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613444 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-hm77f"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613544 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:18:03.614304 master-0 kubenswrapper[16352]: I0307 21:18:03.613598 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:18:03.643413 master-0 kubenswrapper[16352]: I0307 21:18:03.643130 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-master-0"
Mar 07 21:18:03.714581 master-0 kubenswrapper[16352]: I0307 21:18:03.714412 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:18:03.718095 master-0 kubenswrapper[16352]: I0307 21:18:03.717977 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0"
Mar 07 21:18:03.807554 master-0 kubenswrapper[16352]: I0307 21:18:03.807420 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:18:03.814997 master-0 kubenswrapper[16352]: I0307 21:18:03.814871 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=11.814849511 podStartE2EDuration="11.814849511s" podCreationTimestamp="2026-03-07 21:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:18:03.733306153 +0000 UTC m=+6.804011202" watchObservedRunningTime="2026-03-07 21:18:03.814849511 +0000 UTC m=+6.885554570"
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: I0307 21:18:03.863356 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-7hzkm"]
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: E0307 21:18:03.863603 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e757a93e-91aa-4fce-949b-4c51a060528e" containerName="installer"
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: I0307 21:18:03.863670 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e757a93e-91aa-4fce-949b-4c51a060528e" containerName="installer"
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: E0307 21:18:03.863699 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerName="installer"
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: I0307 21:18:03.863706 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerName="installer"
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: E0307 21:18:03.863719 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerName="installer"
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: I0307 21:18:03.863726 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerName="installer"
Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: E0307 21:18:03.863735 16352 cpu_manager.go:410] "RemoveStaleState:
removing container" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller" Mar 07 21:18:03.863716 master-0 kubenswrapper[16352]: I0307 21:18:03.863741 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: E0307 21:18:03.863752 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863760 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: E0307 21:18:03.863773 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2357c135-5d09-4657-9038-48d25ed55b2d" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863779 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2357c135-5d09-4657-9038-48d25ed55b2d" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: E0307 21:18:03.863788 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863794 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: E0307 21:18:03.863807 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerName="cluster-version-operator" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863813 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" 
containerName="cluster-version-operator" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863914 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="e757a93e-91aa-4fce-949b-4c51a060528e" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863925 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc5c4a14-0fdc-4c09-abda-7a2277a20c54" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863934 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2357c135-5d09-4657-9038-48d25ed55b2d" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863945 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe626e91-8685-417b-b581-ef2dbd9e0ba9" containerName="assisted-installer-controller" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863962 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2d827a93-49e5-4694-b119-957cfa9bd648" containerName="prober" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863975 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f4ab99a-1ea2-4bf4-a987-5b6edadedc6b" containerName="cluster-version-operator" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.863992 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddc814a4-b865-4a35-b5f8-f54af449fe25" containerName="installer" Mar 07 21:18:03.864140 master-0 kubenswrapper[16352]: I0307 21:18:03.864000 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="34e734b7-82d6-493d-ace8-1945b2c08c6d" containerName="installer" Mar 07 21:18:03.864555 master-0 kubenswrapper[16352]: I0307 21:18:03.864379 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:03.867038 master-0 kubenswrapper[16352]: I0307 21:18:03.866998 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 07 21:18:03.867444 master-0 kubenswrapper[16352]: I0307 21:18:03.867284 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 21:18:03.867444 master-0 kubenswrapper[16352]: I0307 21:18:03.867427 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 07 21:18:03.881115 master-0 kubenswrapper[16352]: I0307 21:18:03.881066 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7hzkm"] Mar 07 21:18:03.969652 master-0 kubenswrapper[16352]: I0307 21:18:03.969599 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:03.970162 master-0 kubenswrapper[16352]: I0307 21:18:03.970139 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jbczf\" (UniqueName: \"kubernetes.io/projected/705bd6f8-6937-4a16-b03e-5ad3bc684a89-kube-api-access-jbczf\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: I0307 21:18:04.071867 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: 
\"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: I0307 21:18:04.071963 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: I0307 21:18:04.072002 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jbczf\" (UniqueName: \"kubernetes.io/projected/705bd6f8-6937-4a16-b03e-5ad3bc684a89-kube-api-access-jbczf\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: E0307 21:18:04.072555 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: E0307 21:18:04.072578 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: E0307 21:18:04.072622 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:18:05.072604998 +0000 UTC m=+8.143310057 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: E0307 21:18:04.072710 16352 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 07 21:18:04.074856 master-0 kubenswrapper[16352]: E0307 21:18:04.072736 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert podName:705bd6f8-6937-4a16-b03e-5ad3bc684a89 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:04.57272965 +0000 UTC m=+7.643434709 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert") pod "ingress-canary-7hzkm" (UID: "705bd6f8-6937-4a16-b03e-5ad3bc684a89") : secret "canary-serving-cert" not found Mar 07 21:18:04.093549 master-0 kubenswrapper[16352]: I0307 21:18:04.093483 16352 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 07 21:18:04.097854 master-0 kubenswrapper[16352]: I0307 21:18:04.097812 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jbczf\" (UniqueName: \"kubernetes.io/projected/705bd6f8-6937-4a16-b03e-5ad3bc684a89-kube-api-access-jbczf\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:04.119889 master-0 kubenswrapper[16352]: I0307 21:18:04.119803 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=12.119775545 podStartE2EDuration="12.119775545s" podCreationTimestamp="2026-03-07 21:17:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:18:04.118390761 +0000 UTC m=+7.189095820" watchObservedRunningTime="2026-03-07 21:18:04.119775545 +0000 UTC m=+7.190480604" Mar 07 21:18:04.127375 master-0 kubenswrapper[16352]: I0307 21:18:04.127322 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:18:04.170803 master-0 kubenswrapper[16352]: I0307 21:18:04.170736 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:18:04.297478 master-0 kubenswrapper[16352]: I0307 21:18:04.297370 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 21:18:04.305442 master-0 kubenswrapper[16352]: I0307 21:18:04.305363 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:18:04.305637 master-0 kubenswrapper[16352]: I0307 21:18:04.305493 16352 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:18:04.588093 master-0 kubenswrapper[16352]: I0307 21:18:04.588013 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:04.588702 master-0 kubenswrapper[16352]: E0307 21:18:04.588289 16352 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 07 21:18:04.588702 master-0 kubenswrapper[16352]: E0307 21:18:04.588423 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert podName:705bd6f8-6937-4a16-b03e-5ad3bc684a89 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:05.588395838 +0000 UTC m=+8.659100897 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert") pod "ingress-canary-7hzkm" (UID: "705bd6f8-6937-4a16-b03e-5ad3bc684a89") : secret "canary-serving-cert" not found Mar 07 21:18:04.815751 master-0 kubenswrapper[16352]: I0307 21:18:04.815045 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t"] Mar 07 21:18:04.817309 master-0 kubenswrapper[16352]: I0307 21:18:04.817054 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.819759 master-0 kubenswrapper[16352]: I0307 21:18:04.819290 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-gvhc4" Mar 07 21:18:04.820259 master-0 kubenswrapper[16352]: I0307 21:18:04.820216 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 07 21:18:04.821160 master-0 kubenswrapper[16352]: I0307 21:18:04.820563 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config" Mar 07 21:18:04.821160 master-0 kubenswrapper[16352]: I0307 21:18:04.821094 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca" Mar 07 21:18:04.835088 master-0 kubenswrapper[16352]: I0307 21:18:04.835035 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t"] Mar 07 21:18:04.897018 master-0 kubenswrapper[16352]: I0307 21:18:04.896924 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.897018 master-0 kubenswrapper[16352]: I0307 21:18:04.896988 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " 
pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.897018 master-0 kubenswrapper[16352]: I0307 21:18:04.897009 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s9khr\" (UniqueName: \"kubernetes.io/projected/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-kube-api-access-s9khr\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.897306 master-0 kubenswrapper[16352]: I0307 21:18:04.897070 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.998841 master-0 kubenswrapper[16352]: I0307 21:18:04.998760 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.999938 master-0 kubenswrapper[16352]: I0307 21:18:04.998853 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.999938 master-0 kubenswrapper[16352]: I0307 21:18:04.998896 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s9khr\" (UniqueName: \"kubernetes.io/projected/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-kube-api-access-s9khr\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.999938 master-0 kubenswrapper[16352]: I0307 21:18:04.999162 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:04.999938 master-0 kubenswrapper[16352]: E0307 21:18:04.999570 16352 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Mar 07 21:18:04.999938 master-0 kubenswrapper[16352]: E0307 21:18:04.999655 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls podName:9419e98f-3f8e-49d2-a8a2-945cc308f8b1 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:05.499631077 +0000 UTC m=+8.570336176 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-nvm8t" (UID: "9419e98f-3f8e-49d2-a8a2-945cc308f8b1") : secret "prometheus-operator-tls" not found Mar 07 21:18:05.000319 master-0 kubenswrapper[16352]: I0307 21:18:05.000262 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-metrics-client-ca\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:05.004790 master-0 kubenswrapper[16352]: I0307 21:18:05.004738 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:05.032511 master-0 kubenswrapper[16352]: I0307 21:18:05.032421 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s9khr\" (UniqueName: \"kubernetes.io/projected/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-kube-api-access-s9khr\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:05.100859 master-0 kubenswrapper[16352]: I0307 21:18:05.100777 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " 
pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:18:05.101085 master-0 kubenswrapper[16352]: E0307 21:18:05.101040 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:05.101119 master-0 kubenswrapper[16352]: E0307 21:18:05.101087 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:05.101190 master-0 kubenswrapper[16352]: E0307 21:18:05.101171 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:18:07.101141965 +0000 UTC m=+10.171847044 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:05.305356 master-0 kubenswrapper[16352]: I0307 21:18:05.305162 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 21:18:05.394742 master-0 kubenswrapper[16352]: I0307 21:18:05.394606 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:18:05.398379 master-0 kubenswrapper[16352]: I0307 21:18:05.398310 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:18:05.489746 master-0 kubenswrapper[16352]: I0307 21:18:05.489602 16352 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:18:05.508227 master-0 kubenswrapper[16352]: I0307 21:18:05.508154 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:05.508495 master-0 kubenswrapper[16352]: E0307 21:18:05.508367 16352 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Mar 07 21:18:05.508495 master-0 kubenswrapper[16352]: E0307 21:18:05.508431 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls podName:9419e98f-3f8e-49d2-a8a2-945cc308f8b1 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:06.508411648 +0000 UTC m=+9.579116707 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-nvm8t" (UID: "9419e98f-3f8e-49d2-a8a2-945cc308f8b1") : secret "prometheus-operator-tls" not found Mar 07 21:18:05.548816 master-0 kubenswrapper[16352]: I0307 21:18:05.544710 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-vxpb5" Mar 07 21:18:05.610282 master-0 kubenswrapper[16352]: I0307 21:18:05.610204 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:05.611183 master-0 kubenswrapper[16352]: E0307 21:18:05.610478 16352 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Mar 07 21:18:05.611183 master-0 kubenswrapper[16352]: E0307 21:18:05.610648 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert podName:705bd6f8-6937-4a16-b03e-5ad3bc684a89 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:07.610609853 +0000 UTC m=+10.681314952 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert") pod "ingress-canary-7hzkm" (UID: "705bd6f8-6937-4a16-b03e-5ad3bc684a89") : secret "canary-serving-cert" not found Mar 07 21:18:05.741069 master-0 kubenswrapper[16352]: I0307 21:18:05.741002 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:18:05.748059 master-0 kubenswrapper[16352]: I0307 21:18:05.747886 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" Mar 07 21:18:05.835436 master-0 kubenswrapper[16352]: I0307 21:18:05.835301 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:18:05.839251 master-0 kubenswrapper[16352]: I0307 21:18:05.839162 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-fr4qr" Mar 07 21:18:06.191641 master-0 kubenswrapper[16352]: I0307 21:18:06.191530 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:18:06.232022 master-0 kubenswrapper[16352]: I0307 21:18:06.231943 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76" Mar 07 21:18:06.311739 master-0 kubenswrapper[16352]: I0307 21:18:06.311606 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 21:18:06.311739 master-0 kubenswrapper[16352]: I0307 21:18:06.311652 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 07 21:18:06.516634 master-0 kubenswrapper[16352]: I0307 21:18:06.516462 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml"
Mar 07 21:18:06.522139 master-0 kubenswrapper[16352]: I0307 21:18:06.522064 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-8464df8497-lxzml"
Mar 07 21:18:06.528706 master-0 kubenswrapper[16352]: I0307 21:18:06.528626 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t"
Mar 07 21:18:06.528922 master-0 kubenswrapper[16352]: E0307 21:18:06.528850 16352 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Mar 07 21:18:06.529026 master-0 kubenswrapper[16352]: E0307 21:18:06.528995 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls podName:9419e98f-3f8e-49d2-a8a2-945cc308f8b1 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:08.528965743 +0000 UTC m=+11.599670812 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-nvm8t" (UID: "9419e98f-3f8e-49d2-a8a2-945cc308f8b1") : secret "prometheus-operator-tls" not found
Mar 07 21:18:06.720783 master-0 kubenswrapper[16352]: I0307 21:18:06.720725 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:18:06.774504 master-0 kubenswrapper[16352]: I0307 21:18:06.774350 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:18:06.880896 master-0 kubenswrapper[16352]: I0307 21:18:06.880796 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:18:06.909837 master-0 kubenswrapper[16352]: I0307 21:18:06.909777 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:18:07.139752 master-0 kubenswrapper[16352]: I0307 21:18:07.137382 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-2grlf"]
Mar 07 21:18:07.141494 master-0 kubenswrapper[16352]: I0307 21:18:07.140886 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.143095 master-0 kubenswrapper[16352]: I0307 21:18:07.143056 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:18:07.143323 master-0 kubenswrapper[16352]: E0307 21:18:07.143293 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:07.143386 master-0 kubenswrapper[16352]: E0307 21:18:07.143321 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:07.143434 master-0 kubenswrapper[16352]: E0307 21:18:07.143388 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:18:11.143368652 +0000 UTC m=+14.214073721 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:07.151390 master-0 kubenswrapper[16352]: I0307 21:18:07.151243 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 07 21:18:07.151777 master-0 kubenswrapper[16352]: I0307 21:18:07.151669 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 07 21:18:07.152073 master-0 kubenswrapper[16352]: I0307 21:18:07.151987 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 07 21:18:07.152152 master-0 kubenswrapper[16352]: I0307 21:18:07.152105 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 07 21:18:07.153058 master-0 kubenswrapper[16352]: I0307 21:18:07.152322 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 07 21:18:07.171222 master-0 kubenswrapper[16352]: I0307 21:18:07.171162 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-2grlf"]
Mar 07 21:18:07.244265 master-0 kubenswrapper[16352]: I0307 21:18:07.244204 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.244588 master-0 kubenswrapper[16352]: I0307 21:18:07.244571 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28fd6ebb-51ac-4763-99b2-3a94b124d059-serving-cert\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.245004 master-0 kubenswrapper[16352]: I0307 21:18:07.244946 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8vhb\" (UniqueName: \"kubernetes.io/projected/28fd6ebb-51ac-4763-99b2-3a94b124d059-kube-api-access-p8vhb\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.245078 master-0 kubenswrapper[16352]: I0307 21:18:07.245049 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-config\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.273101 master-0 kubenswrapper[16352]: I0307 21:18:07.272862 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:18:07.276670 master-0 kubenswrapper[16352]: I0307 21:18:07.276622 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:18:07.317183 master-0 kubenswrapper[16352]: I0307 21:18:07.317135 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:18:07.347243 master-0 kubenswrapper[16352]: I0307 21:18:07.347146 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.347243 master-0 kubenswrapper[16352]: I0307 21:18:07.347260 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28fd6ebb-51ac-4763-99b2-3a94b124d059-serving-cert\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.347493 master-0 kubenswrapper[16352]: I0307 21:18:07.347324 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p8vhb\" (UniqueName: \"kubernetes.io/projected/28fd6ebb-51ac-4763-99b2-3a94b124d059-kube-api-access-p8vhb\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.347493 master-0 kubenswrapper[16352]: I0307 21:18:07.347349 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-config\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.347493 master-0 kubenswrapper[16352]: E0307 21:18:07.347389 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:07.847361623 +0000 UTC m=+10.918066682 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:07.349147 master-0 kubenswrapper[16352]: I0307 21:18:07.348751 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-config\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.350752 master-0 kubenswrapper[16352]: I0307 21:18:07.350726 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/28fd6ebb-51ac-4763-99b2-3a94b124d059-serving-cert\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.376661 master-0 kubenswrapper[16352]: I0307 21:18:07.376581 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8vhb\" (UniqueName: \"kubernetes.io/projected/28fd6ebb-51ac-4763-99b2-3a94b124d059-kube-api-access-p8vhb\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.653220 master-0 kubenswrapper[16352]: I0307 21:18:07.653165 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm"
Mar 07 21:18:07.653574 master-0 kubenswrapper[16352]: E0307 21:18:07.653399 16352 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 07 21:18:07.653669 master-0 kubenswrapper[16352]: E0307 21:18:07.653644 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert podName:705bd6f8-6937-4a16-b03e-5ad3bc684a89 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:11.653616859 +0000 UTC m=+14.724321918 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert") pod "ingress-canary-7hzkm" (UID: "705bd6f8-6937-4a16-b03e-5ad3bc684a89") : secret "canary-serving-cert" not found
Mar 07 21:18:07.856877 master-0 kubenswrapper[16352]: I0307 21:18:07.856750 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:07.857913 master-0 kubenswrapper[16352]: E0307 21:18:07.857060 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:08.857019916 +0000 UTC m=+11.927724975 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:07.952278 master-0 kubenswrapper[16352]: I0307 21:18:07.952125 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:18:07.952829 master-0 kubenswrapper[16352]: I0307 21:18:07.952799 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:18:07.963866 master-0 kubenswrapper[16352]: I0307 21:18:07.963803 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:18:08.326333 master-0 kubenswrapper[16352]: I0307 21:18:08.326168 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:18:08.492516 master-0 kubenswrapper[16352]: I0307 21:18:08.492440 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:18:08.492802 master-0 kubenswrapper[16352]: I0307 21:18:08.492651 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:18:08.498762 master-0 kubenswrapper[16352]: I0307 21:18:08.498656 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-79f8cd6fdd-858hg"
Mar 07 21:18:08.568739 master-0 kubenswrapper[16352]: I0307 21:18:08.568586 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t"
Mar 07 21:18:08.570036 master-0 kubenswrapper[16352]: E0307 21:18:08.569525 16352 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Mar 07 21:18:08.570036 master-0 kubenswrapper[16352]: E0307 21:18:08.569625 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls podName:9419e98f-3f8e-49d2-a8a2-945cc308f8b1 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:12.569597932 +0000 UTC m=+15.640303001 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-nvm8t" (UID: "9419e98f-3f8e-49d2-a8a2-945cc308f8b1") : secret "prometheus-operator-tls" not found
Mar 07 21:18:08.597338 master-0 kubenswrapper[16352]: I0307 21:18:08.597280 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-694d775589-btnh4"
Mar 07 21:18:08.602716 master-0 kubenswrapper[16352]: I0307 21:18:08.602637 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-67cf6dffcb-4z6hx"
Mar 07 21:18:08.657229 master-0 kubenswrapper[16352]: I0307 21:18:08.657083 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:18:08.763813 master-0 kubenswrapper[16352]: I0307 21:18:08.763073 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:18:08.874993 master-0 kubenswrapper[16352]: I0307 21:18:08.874777 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:08.875781 master-0 kubenswrapper[16352]: E0307 21:18:08.875102 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:10.875079031 +0000 UTC m=+13.945784100 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:09.596765 master-0 kubenswrapper[16352]: I0307 21:18:09.596645 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:18:09.632103 master-0 kubenswrapper[16352]: I0307 21:18:09.632025 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:18:09.670904 master-0 kubenswrapper[16352]: I0307 21:18:09.670181 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:18:09.677975 master-0 kubenswrapper[16352]: I0307 21:18:09.677920 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:18:09.682846 master-0 kubenswrapper[16352]: I0307 21:18:09.682753 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg"
Mar 07 21:18:09.683823 master-0 kubenswrapper[16352]: I0307 21:18:09.683648 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rw59s"
Mar 07 21:18:09.968666 master-0 kubenswrapper[16352]: I0307 21:18:09.968435 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:18:10.032149 master-0 kubenswrapper[16352]: I0307 21:18:10.032073 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z2cc9"
Mar 07 21:18:10.250784 master-0 kubenswrapper[16352]: I0307 21:18:10.250624 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:18:10.251071 master-0 kubenswrapper[16352]: I0307 21:18:10.251039 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:18:10.278028 master-0 kubenswrapper[16352]: I0307 21:18:10.277922 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-x9v76"
Mar 07 21:18:10.289225 master-0 kubenswrapper[16352]: I0307 21:18:10.289056 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"
Mar 07 21:18:10.293103 master-0 kubenswrapper[16352]: I0307 21:18:10.293049 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"
Mar 07 21:18:10.542267 master-0 kubenswrapper[16352]: I0307 21:18:10.542086 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:18:10.552474 master-0 kubenswrapper[16352]: I0307 21:18:10.552375 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x"
Mar 07 21:18:10.745105 master-0 kubenswrapper[16352]: I0307 21:18:10.745030 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:18:10.804061 master-0 kubenswrapper[16352]: I0307 21:18:10.803901 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-fdltd"
Mar 07 21:18:10.915892 master-0 kubenswrapper[16352]: I0307 21:18:10.915800 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:10.916152 master-0 kubenswrapper[16352]: E0307 21:18:10.916087 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:14.916048187 +0000 UTC m=+17.986753436 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:11.220421 master-0 kubenswrapper[16352]: I0307 21:18:11.220354 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:18:11.221109 master-0 kubenswrapper[16352]: E0307 21:18:11.220605 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:11.221109 master-0 kubenswrapper[16352]: E0307 21:18:11.220658 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:11.221109 master-0 kubenswrapper[16352]: E0307 21:18:11.220762 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:18:19.220735076 +0000 UTC m=+22.291440325 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:11.239878 master-0 kubenswrapper[16352]: I0307 21:18:11.239812 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 07 21:18:11.240199 master-0 kubenswrapper[16352]: I0307 21:18:11.240061 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" containerID="cri-o://d24d032319a9f87acbbf34deb36cb14122c07e93e1e3dd0d42d28beaf572ecc6" gracePeriod=5
Mar 07 21:18:11.729633 master-0 kubenswrapper[16352]: I0307 21:18:11.729558 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm"
Mar 07 21:18:11.729955 master-0 kubenswrapper[16352]: E0307 21:18:11.729805 16352 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Mar 07 21:18:11.729955 master-0 kubenswrapper[16352]: E0307 21:18:11.729917 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert podName:705bd6f8-6937-4a16-b03e-5ad3bc684a89 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:19.729892497 +0000 UTC m=+22.800597556 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert") pod "ingress-canary-7hzkm" (UID: "705bd6f8-6937-4a16-b03e-5ad3bc684a89") : secret "canary-serving-cert" not found
Mar 07 21:18:12.641863 master-0 kubenswrapper[16352]: I0307 21:18:12.641795 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t"
Mar 07 21:18:12.642542 master-0 kubenswrapper[16352]: E0307 21:18:12.641960 16352 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Mar 07 21:18:12.642542 master-0 kubenswrapper[16352]: E0307 21:18:12.642036 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls podName:9419e98f-3f8e-49d2-a8a2-945cc308f8b1 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:20.642016466 +0000 UTC m=+23.712721525 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls") pod "prometheus-operator-5ff8674d55-nvm8t" (UID: "9419e98f-3f8e-49d2-a8a2-945cc308f8b1") : secret "prometheus-operator-tls" not found
Mar 07 21:18:12.660545 master-0 kubenswrapper[16352]: I0307 21:18:12.660475 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:18:12.666325 master-0 kubenswrapper[16352]: I0307 21:18:12.666292 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-d64cfc9db-qd6xh"
Mar 07 21:18:13.411722 master-0 kubenswrapper[16352]: I0307 21:18:13.411238 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:18:13.418119 master-0 kubenswrapper[16352]: I0307 21:18:13.418063 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:18:14.987117 master-0 kubenswrapper[16352]: I0307 21:18:14.987049 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:14.988880 master-0 kubenswrapper[16352]: E0307 21:18:14.988862 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:22.98884436 +0000 UTC m=+26.059549419 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:16.379134 master-0 kubenswrapper[16352]: I0307 21:18:16.379015 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log"
Mar 07 21:18:16.379134 master-0 kubenswrapper[16352]: I0307 21:18:16.379063 16352 generic.go:334] "Generic (PLEG): container finished" podID="f417e14665db2ffffa887ce21c9ff0ed" containerID="d24d032319a9f87acbbf34deb36cb14122c07e93e1e3dd0d42d28beaf572ecc6" exitCode=137
Mar 07 21:18:16.379134 master-0 kubenswrapper[16352]: I0307 21:18:16.379101 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6c2e629c1b173cac44718a698c182b6cdb51f19fbab8e65d985e07288b0f174"
Mar 07 21:18:16.394370 master-0 kubenswrapper[16352]: I0307 21:18:16.394338 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_f417e14665db2ffffa887ce21c9ff0ed/startup-monitor/0.log"
Mar 07 21:18:16.394631 master-0 kubenswrapper[16352]: I0307 21:18:16.394618 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:18:16.515372 master-0 kubenswrapper[16352]: I0307 21:18:16.515236 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 07 21:18:16.515656 master-0 kubenswrapper[16352]: I0307 21:18:16.515409 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:18:16.515731 master-0 kubenswrapper[16352]: I0307 21:18:16.515603 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 07 21:18:16.515793 master-0 kubenswrapper[16352]: I0307 21:18:16.515767 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 07 21:18:16.515832 master-0 kubenswrapper[16352]: I0307 21:18:16.515815 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 07 21:18:16.515899 master-0 kubenswrapper[16352]: I0307 21:18:16.515880 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests" (OuterVolumeSpecName: "manifests") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:18:16.515969 master-0 kubenswrapper[16352]: I0307 21:18:16.515942 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log" (OuterVolumeSpecName: "var-log") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:18:16.516008 master-0 kubenswrapper[16352]: I0307 21:18:16.515954 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") pod \"f417e14665db2ffffa887ce21c9ff0ed\" (UID: \"f417e14665db2ffffa887ce21c9ff0ed\") "
Mar 07 21:18:16.516008 master-0 kubenswrapper[16352]: I0307 21:18:16.515982 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:18:16.516605 master-0 kubenswrapper[16352]: I0307 21:18:16.516525 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 07 21:18:16.516605 master-0 kubenswrapper[16352]: I0307 21:18:16.516548 16352 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-manifests\") on node \"master-0\" DevicePath \"\""
Mar 07 21:18:16.516605 master-0 kubenswrapper[16352]: I0307 21:18:16.516562 16352 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-var-log\") on node \"master-0\" DevicePath \"\""
Mar 07 21:18:16.516605 master-0 kubenswrapper[16352]: I0307 21:18:16.516575 16352 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:18:16.524590 master-0 kubenswrapper[16352]: I0307 21:18:16.523899 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f417e14665db2ffffa887ce21c9ff0ed" (UID: "f417e14665db2ffffa887ce21c9ff0ed"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:18:16.617620 master-0 kubenswrapper[16352]: I0307 21:18:16.617556 16352 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f417e14665db2ffffa887ce21c9ff0ed-pod-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:18:17.198277 master-0 kubenswrapper[16352]: I0307 21:18:17.198203 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f417e14665db2ffffa887ce21c9ff0ed" path="/var/lib/kubelet/pods/f417e14665db2ffffa887ce21c9ff0ed/volumes"
Mar 07 21:18:17.198528 master-0 kubenswrapper[16352]: I0307 21:18:17.198486 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID=""
Mar 07 21:18:17.229700 master-0 kubenswrapper[16352]: I0307 21:18:17.229433 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 07 21:18:17.229700 master-0 kubenswrapper[16352]: I0307 21:18:17.229473 16352 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="0ea92965-94e9-41d6-aa55-ca23685450ed"
Mar 07 21:18:17.232589 master-0 kubenswrapper[16352]: I0307 21:18:17.232539 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 07 21:18:17.232653 master-0 kubenswrapper[16352]: I0307 21:18:17.232592 16352 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="0ea92965-94e9-41d6-aa55-ca23685450ed"
Mar 07 21:18:17.385956 master-0 kubenswrapper[16352]: I0307 21:18:17.385886 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:18:19.260165 master-0 kubenswrapper[16352]: I0307 21:18:19.260087 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:18:19.260827 master-0 kubenswrapper[16352]: E0307 21:18:19.260335 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:19.260827 master-0 kubenswrapper[16352]: E0307 21:18:19.260383 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:18:19.260827 master-0 kubenswrapper[16352]: E0307 21:18:19.260457 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:18:35.260430698 +0000 UTC m=+38.331135757 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:19.768382 master-0 kubenswrapper[16352]: I0307 21:18:19.768303 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:19.772781 master-0 kubenswrapper[16352]: I0307 21:18:19.772725 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/705bd6f8-6937-4a16-b03e-5ad3bc684a89-cert\") pod \"ingress-canary-7hzkm\" (UID: \"705bd6f8-6937-4a16-b03e-5ad3bc684a89\") " pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:19.794611 master-0 kubenswrapper[16352]: I0307 21:18:19.794546 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-7hzkm" Mar 07 21:18:20.221337 master-0 kubenswrapper[16352]: I0307 21:18:20.221276 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-7hzkm"] Mar 07 21:18:20.406834 master-0 kubenswrapper[16352]: I0307 21:18:20.406766 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7hzkm" event={"ID":"705bd6f8-6937-4a16-b03e-5ad3bc684a89","Type":"ContainerStarted","Data":"ff49956acc93e8225f130f4b91d86fc2aac8365ea47e96d4a9bd561a43e93bae"} Mar 07 21:18:20.406834 master-0 kubenswrapper[16352]: I0307 21:18:20.406820 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-7hzkm" event={"ID":"705bd6f8-6937-4a16-b03e-5ad3bc684a89","Type":"ContainerStarted","Data":"2deb7e132f93dc511f667506e24fb2d7772bbcd564d4ff3c7b4bba7c143cd295"} Mar 07 21:18:20.425465 master-0 kubenswrapper[16352]: I0307 21:18:20.425351 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-7hzkm" podStartSLOduration=17.425320915 podStartE2EDuration="17.425320915s" podCreationTimestamp="2026-03-07 21:18:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:18:20.423943912 +0000 UTC m=+23.494648971" watchObservedRunningTime="2026-03-07 21:18:20.425320915 +0000 UTC m=+23.496026004" Mar 07 21:18:20.682784 master-0 kubenswrapper[16352]: I0307 21:18:20.682670 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:20.687748 master-0 
kubenswrapper[16352]: I0307 21:18:20.687672 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/9419e98f-3f8e-49d2-a8a2-945cc308f8b1-prometheus-operator-tls\") pod \"prometheus-operator-5ff8674d55-nvm8t\" (UID: \"9419e98f-3f8e-49d2-a8a2-945cc308f8b1\") " pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:20.746704 master-0 kubenswrapper[16352]: I0307 21:18:20.746601 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" Mar 07 21:18:21.195326 master-0 kubenswrapper[16352]: I0307 21:18:21.195057 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t"] Mar 07 21:18:21.196597 master-0 kubenswrapper[16352]: W0307 21:18:21.196534 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9419e98f_3f8e_49d2_a8a2_945cc308f8b1.slice/crio-9ad5ff9bf4da7d4fd72aaf3e7e07c3dcdf799e9d01036ce29b8b04494545a5b1 WatchSource:0}: Error finding container 9ad5ff9bf4da7d4fd72aaf3e7e07c3dcdf799e9d01036ce29b8b04494545a5b1: Status 404 returned error can't find the container with id 9ad5ff9bf4da7d4fd72aaf3e7e07c3dcdf799e9d01036ce29b8b04494545a5b1 Mar 07 21:18:21.198874 master-0 kubenswrapper[16352]: I0307 21:18:21.198828 16352 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 21:18:21.415586 master-0 kubenswrapper[16352]: I0307 21:18:21.414914 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" event={"ID":"9419e98f-3f8e-49d2-a8a2-945cc308f8b1","Type":"ContainerStarted","Data":"9ad5ff9bf4da7d4fd72aaf3e7e07c3dcdf799e9d01036ce29b8b04494545a5b1"} Mar 07 21:18:23.023572 master-0 kubenswrapper[16352]: I0307 21:18:23.023424 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" Mar 07 21:18:23.024862 master-0 kubenswrapper[16352]: E0307 21:18:23.023662 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:39.023626535 +0000 UTC m=+42.094331614 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:24.446389 master-0 kubenswrapper[16352]: I0307 21:18:24.446255 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" event={"ID":"9419e98f-3f8e-49d2-a8a2-945cc308f8b1","Type":"ContainerStarted","Data":"c167bf0527119828066e53508133809ed8db2ebcce1541250f59d80c8a727afa"} Mar 07 21:18:24.446389 master-0 kubenswrapper[16352]: I0307 21:18:24.446383 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" event={"ID":"9419e98f-3f8e-49d2-a8a2-945cc308f8b1","Type":"ContainerStarted","Data":"dd3c0c19a2ac87f691c97b2b3962cea7aa9f3abbd51a84b6fe691e25ccf7ab35"} Mar 07 21:18:24.469775 master-0 kubenswrapper[16352]: I0307 21:18:24.469654 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5ff8674d55-nvm8t" podStartSLOduration=18.121886854 podStartE2EDuration="20.46962697s" 
podCreationTimestamp="2026-03-07 21:18:04 +0000 UTC" firstStartedPulling="2026-03-07 21:18:21.198745819 +0000 UTC m=+24.269450878" lastFinishedPulling="2026-03-07 21:18:23.546485915 +0000 UTC m=+26.617190994" observedRunningTime="2026-03-07 21:18:24.468853042 +0000 UTC m=+27.539558141" watchObservedRunningTime="2026-03-07 21:18:24.46962697 +0000 UTC m=+27.540332059" Mar 07 21:18:26.216721 master-0 kubenswrapper[16352]: I0307 21:18:26.214672 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-c8pdj"] Mar 07 21:18:26.221105 master-0 kubenswrapper[16352]: E0307 21:18:26.220937 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 07 21:18:26.221105 master-0 kubenswrapper[16352]: I0307 21:18:26.220981 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 07 21:18:26.221447 master-0 kubenswrapper[16352]: I0307 21:18:26.221390 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f417e14665db2ffffa887ce21c9ff0ed" containerName="startup-monitor" Mar 07 21:18:26.235490 master-0 kubenswrapper[16352]: I0307 21:18:26.235425 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.239170 master-0 kubenswrapper[16352]: I0307 21:18:26.239107 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls" Mar 07 21:18:26.240111 master-0 kubenswrapper[16352]: I0307 21:18:26.239460 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config" Mar 07 21:18:26.240111 master-0 kubenswrapper[16352]: I0307 21:18:26.239640 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-jtttv" Mar 07 21:18:26.240111 master-0 kubenswrapper[16352]: I0307 21:18:26.239792 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r"] Mar 07 21:18:26.242373 master-0 kubenswrapper[16352]: I0307 21:18:26.241788 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66"] Mar 07 21:18:26.243069 master-0 kubenswrapper[16352]: I0307 21:18:26.242807 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.244016 master-0 kubenswrapper[16352]: I0307 21:18:26.243933 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.248268 master-0 kubenswrapper[16352]: I0307 21:18:26.246960 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-7982x" Mar 07 21:18:26.253045 master-0 kubenswrapper[16352]: I0307 21:18:26.250116 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config" Mar 07 21:18:26.253045 master-0 kubenswrapper[16352]: I0307 21:18:26.250358 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config" Mar 07 21:18:26.253045 master-0 kubenswrapper[16352]: I0307 21:18:26.250480 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap" Mar 07 21:18:26.253045 master-0 kubenswrapper[16352]: I0307 21:18:26.250616 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-xvgqm" Mar 07 21:18:26.253045 master-0 kubenswrapper[16352]: I0307 21:18:26.250771 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls" Mar 07 21:18:26.253045 master-0 kubenswrapper[16352]: I0307 21:18:26.250908 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls" Mar 07 21:18:26.259296 master-0 kubenswrapper[16352]: I0307 21:18:26.257537 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r"] Mar 07 21:18:26.264179 master-0 kubenswrapper[16352]: I0307 21:18:26.264125 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66"] Mar 07 21:18:26.385740 master-0 kubenswrapper[16352]: I0307 
21:18:26.385096 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-tls\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.385947 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-wtmp\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386034 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386058 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdjtj\" (UniqueName: \"kubernetes.io/projected/f9010245-1668-4f71-979a-077fb174a8d5-kube-api-access-sdjtj\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386077 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/c32f43eb-a408-493b-bf11-9386ef49cf68-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386146 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea0c0a12-3b05-45aa-900f-e1757ece41a1-metrics-client-ca\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386165 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-textfile\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386191 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c32f43eb-a408-493b-bf11-9386ef49cf68-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386238 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjv5r\" (UniqueName: \"kubernetes.io/projected/ea0c0a12-3b05-45aa-900f-e1757ece41a1-kube-api-access-gjv5r\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " 
pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386295 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-sys\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386305 master-0 kubenswrapper[16352]: I0307 21:18:26.386313 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-root\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386851 master-0 kubenswrapper[16352]: I0307 21:18:26.386343 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.386851 master-0 kubenswrapper[16352]: I0307 21:18:26.386378 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c32f43eb-a408-493b-bf11-9386ef49cf68-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.386851 master-0 kubenswrapper[16352]: I0307 21:18:26.386416 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cm7jp\" (UniqueName: \"kubernetes.io/projected/c32f43eb-a408-493b-bf11-9386ef49cf68-kube-api-access-cm7jp\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.386851 master-0 kubenswrapper[16352]: I0307 21:18:26.386434 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.386851 master-0 kubenswrapper[16352]: I0307 21:18:26.386455 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.386851 master-0 kubenswrapper[16352]: I0307 21:18:26.386477 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9010245-1668-4f71-979a-077fb174a8d5-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.386851 master-0 kubenswrapper[16352]: I0307 21:18:26.386502 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/f9010245-1668-4f71-979a-077fb174a8d5-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.488780 master-0 kubenswrapper[16352]: I0307 21:18:26.488600 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-wtmp\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.488991 master-0 kubenswrapper[16352]: I0307 21:18:26.488793 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.488991 master-0 kubenswrapper[16352]: I0307 21:18:26.488837 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sdjtj\" (UniqueName: \"kubernetes.io/projected/f9010245-1668-4f71-979a-077fb174a8d5-kube-api-access-sdjtj\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.488991 master-0 kubenswrapper[16352]: I0307 21:18:26.488874 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c32f43eb-a408-493b-bf11-9386ef49cf68-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " 
pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.488991 master-0 kubenswrapper[16352]: I0307 21:18:26.488920 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-wtmp\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.488991 master-0 kubenswrapper[16352]: I0307 21:18:26.488951 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea0c0a12-3b05-45aa-900f-e1757ece41a1-metrics-client-ca\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.488991 master-0 kubenswrapper[16352]: I0307 21:18:26.488988 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-textfile\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.489172 master-0 kubenswrapper[16352]: I0307 21:18:26.489025 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c32f43eb-a408-493b-bf11-9386ef49cf68-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.489400 master-0 kubenswrapper[16352]: I0307 21:18:26.489370 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjv5r\" (UniqueName: 
\"kubernetes.io/projected/ea0c0a12-3b05-45aa-900f-e1757ece41a1-kube-api-access-gjv5r\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.489436 master-0 kubenswrapper[16352]: I0307 21:18:26.489428 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-sys\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.489468 master-0 kubenswrapper[16352]: I0307 21:18:26.489445 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-root\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.489524 master-0 kubenswrapper[16352]: I0307 21:18:26.489490 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.489753 master-0 kubenswrapper[16352]: I0307 21:18:26.489670 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-sys\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.489874 master-0 kubenswrapper[16352]: I0307 21:18:26.489822 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/c32f43eb-a408-493b-bf11-9386ef49cf68-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.489965 master-0 kubenswrapper[16352]: I0307 21:18:26.489946 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c32f43eb-a408-493b-bf11-9386ef49cf68-metrics-client-ca\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.490046 master-0 kubenswrapper[16352]: I0307 21:18:26.489955 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cm7jp\" (UniqueName: \"kubernetes.io/projected/c32f43eb-a408-493b-bf11-9386ef49cf68-kube-api-access-cm7jp\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.490046 master-0 kubenswrapper[16352]: I0307 21:18:26.489948 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/ea0c0a12-3b05-45aa-900f-e1757ece41a1-root\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.490123 master-0 kubenswrapper[16352]: I0307 21:18:26.490016 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-textfile\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " 
pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.490181 master-0 kubenswrapper[16352]: I0307 21:18:26.490131 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.490323 master-0 kubenswrapper[16352]: I0307 21:18:26.490264 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.490385 master-0 kubenswrapper[16352]: I0307 21:18:26.490361 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.490455 master-0 kubenswrapper[16352]: I0307 21:18:26.490425 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9010245-1668-4f71-979a-077fb174a8d5-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.490517 master-0 kubenswrapper[16352]: I0307 21:18:26.490465 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ea0c0a12-3b05-45aa-900f-e1757ece41a1-metrics-client-ca\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.490586 master-0 kubenswrapper[16352]: I0307 21:18:26.490476 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f9010245-1668-4f71-979a-077fb174a8d5-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.490747 master-0 kubenswrapper[16352]: I0307 21:18:26.490715 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-tls\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.492009 master-0 kubenswrapper[16352]: I0307 21:18:26.491974 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/f9010245-1668-4f71-979a-077fb174a8d5-metrics-client-ca\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.492060 master-0 kubenswrapper[16352]: I0307 21:18:26.492036 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/f9010245-1668-4f71-979a-077fb174a8d5-volume-directive-shadow\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.494336 master-0 
kubenswrapper[16352]: I0307 21:18:26.494293 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-tls\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.494439 master-0 kubenswrapper[16352]: I0307 21:18:26.494417 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.495504 master-0 kubenswrapper[16352]: I0307 21:18:26.495474 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/ea0c0a12-3b05-45aa-900f-e1757ece41a1-node-exporter-tls\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.496140 master-0 kubenswrapper[16352]: I0307 21:18:26.496074 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/f9010245-1668-4f71-979a-077fb174a8d5-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.499653 master-0 kubenswrapper[16352]: I0307 21:18:26.499595 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/c32f43eb-a408-493b-bf11-9386ef49cf68-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.500901 master-0 kubenswrapper[16352]: I0307 21:18:26.500848 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/c32f43eb-a408-493b-bf11-9386ef49cf68-openshift-state-metrics-tls\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.506038 master-0 kubenswrapper[16352]: I0307 21:18:26.505997 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdjtj\" (UniqueName: \"kubernetes.io/projected/f9010245-1668-4f71-979a-077fb174a8d5-kube-api-access-sdjtj\") pod \"kube-state-metrics-68b88f8cb5-5cj66\" (UID: \"f9010245-1668-4f71-979a-077fb174a8d5\") " pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:26.511286 master-0 kubenswrapper[16352]: I0307 21:18:26.511224 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjv5r\" (UniqueName: \"kubernetes.io/projected/ea0c0a12-3b05-45aa-900f-e1757ece41a1-kube-api-access-gjv5r\") pod \"node-exporter-c8pdj\" (UID: \"ea0c0a12-3b05-45aa-900f-e1757ece41a1\") " pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.520286 master-0 kubenswrapper[16352]: I0307 21:18:26.520178 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cm7jp\" (UniqueName: \"kubernetes.io/projected/c32f43eb-a408-493b-bf11-9386ef49cf68-kube-api-access-cm7jp\") pod \"openshift-state-metrics-74cc79fd76-84z7r\" (UID: \"c32f43eb-a408-493b-bf11-9386ef49cf68\") " 
pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.558674 master-0 kubenswrapper[16352]: I0307 21:18:26.558592 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-c8pdj" Mar 07 21:18:26.580548 master-0 kubenswrapper[16352]: I0307 21:18:26.580451 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" Mar 07 21:18:26.616906 master-0 kubenswrapper[16352]: I0307 21:18:26.616852 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" Mar 07 21:18:27.089534 master-0 kubenswrapper[16352]: I0307 21:18:27.089382 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r"] Mar 07 21:18:27.098914 master-0 kubenswrapper[16352]: W0307 21:18:27.098860 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc32f43eb_a408_493b_bf11_9386ef49cf68.slice/crio-de3de8037bd740b49252b4e554557bde52dd08aff6fec7dedf0d28f28ddbb538 WatchSource:0}: Error finding container de3de8037bd740b49252b4e554557bde52dd08aff6fec7dedf0d28f28ddbb538: Status 404 returned error can't find the container with id de3de8037bd740b49252b4e554557bde52dd08aff6fec7dedf0d28f28ddbb538 Mar 07 21:18:27.138312 master-0 kubenswrapper[16352]: I0307 21:18:27.138263 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66"] Mar 07 21:18:27.143883 master-0 kubenswrapper[16352]: W0307 21:18:27.143831 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf9010245_1668_4f71_979a_077fb174a8d5.slice/crio-7decd66a645a0b0f6ef04434daabb1fbf7ef495e06fe0338386cbdbbff6cfa58 WatchSource:0}: Error 
finding container 7decd66a645a0b0f6ef04434daabb1fbf7ef495e06fe0338386cbdbbff6cfa58: Status 404 returned error can't find the container with id 7decd66a645a0b0f6ef04434daabb1fbf7ef495e06fe0338386cbdbbff6cfa58 Mar 07 21:18:27.479656 master-0 kubenswrapper[16352]: I0307 21:18:27.479502 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 07 21:18:27.489867 master-0 kubenswrapper[16352]: I0307 21:18:27.489801 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.494181 master-0 kubenswrapper[16352]: I0307 21:18:27.494136 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web" Mar 07 21:18:27.494279 master-0 kubenswrapper[16352]: I0307 21:18:27.494206 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric" Mar 07 21:18:27.494600 master-0 kubenswrapper[16352]: I0307 21:18:27.494569 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy" Mar 07 21:18:27.499558 master-0 kubenswrapper[16352]: I0307 21:18:27.498972 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" event={"ID":"c32f43eb-a408-493b-bf11-9386ef49cf68","Type":"ContainerStarted","Data":"e014a20b98fa2051d6fe299712baba6ed92f2178181bf65f28eb1344dd413af2"} Mar 07 21:18:27.499558 master-0 kubenswrapper[16352]: I0307 21:18:27.499039 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" event={"ID":"c32f43eb-a408-493b-bf11-9386ef49cf68","Type":"ContainerStarted","Data":"5a772f72af62a1e009df81bc6672d3fce88f8496536b9f6a85e29781e501f9c4"} Mar 07 21:18:27.499558 master-0 kubenswrapper[16352]: I0307 21:18:27.499055 16352 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" event={"ID":"c32f43eb-a408-493b-bf11-9386ef49cf68","Type":"ContainerStarted","Data":"de3de8037bd740b49252b4e554557bde52dd08aff6fec7dedf0d28f28ddbb538"} Mar 07 21:18:27.502099 master-0 kubenswrapper[16352]: I0307 21:18:27.501927 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" event={"ID":"f9010245-1668-4f71-979a-077fb174a8d5","Type":"ContainerStarted","Data":"7decd66a645a0b0f6ef04434daabb1fbf7ef495e06fe0338386cbdbbff6cfa58"} Mar 07 21:18:27.503139 master-0 kubenswrapper[16352]: I0307 21:18:27.503104 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c8pdj" event={"ID":"ea0c0a12-3b05-45aa-900f-e1757ece41a1","Type":"ContainerStarted","Data":"ef080581542b0d361e0ea235aea8516f835f98ab32ea0ff29c756552722693d0"} Mar 07 21:18:27.506756 master-0 kubenswrapper[16352]: I0307 21:18:27.505237 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-7bx66" Mar 07 21:18:27.506756 master-0 kubenswrapper[16352]: I0307 21:18:27.505569 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0" Mar 07 21:18:27.506756 master-0 kubenswrapper[16352]: I0307 21:18:27.506504 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle" Mar 07 21:18:27.506756 master-0 kubenswrapper[16352]: I0307 21:18:27.506576 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated" Mar 07 21:18:27.507133 master-0 kubenswrapper[16352]: I0307 21:18:27.507017 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config" Mar 07 21:18:27.507439 master-0 kubenswrapper[16352]: I0307 
21:18:27.507394 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls" Mar 07 21:18:27.534438 master-0 kubenswrapper[16352]: I0307 21:18:27.525598 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.625877 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-volume\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.625931 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.625980 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-tls-assets\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.625997 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626036 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-out\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626057 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626075 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw7xl\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-kube-api-access-dw7xl\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626105 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626124 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" 
(UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626149 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-web-config\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626167 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.626789 master-0 kubenswrapper[16352]: I0307 21:18:27.626186 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.729181 master-0 kubenswrapper[16352]: I0307 21:18:27.729093 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-volume\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.729574 master-0 kubenswrapper[16352]: I0307 21:18:27.729496 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.729745 master-0 kubenswrapper[16352]: I0307 21:18:27.729717 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-tls-assets\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.729806 master-0 kubenswrapper[16352]: I0307 21:18:27.729755 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.729881 master-0 kubenswrapper[16352]: I0307 21:18:27.729857 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-out\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.729938 master-0 kubenswrapper[16352]: I0307 21:18:27.729908 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.730423 master-0 
kubenswrapper[16352]: I0307 21:18:27.730394 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw7xl\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-kube-api-access-dw7xl\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.730541 master-0 kubenswrapper[16352]: I0307 21:18:27.730498 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.730541 master-0 kubenswrapper[16352]: I0307 21:18:27.730512 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.730701 master-0 kubenswrapper[16352]: I0307 21:18:27.730591 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.730761 master-0 kubenswrapper[16352]: I0307 21:18:27.730706 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-web-config\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.730761 master-0 kubenswrapper[16352]: I0307 21:18:27.730739 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.730877 master-0 kubenswrapper[16352]: E0307 21:18:27.730805 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:28.230754578 +0000 UTC m=+31.301459807 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:27.730956 master-0 kubenswrapper[16352]: I0307 21:18:27.730899 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.733057 master-0 kubenswrapper[16352]: I0307 21:18:27.733013 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-volume\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 
07 21:18:27.734545 master-0 kubenswrapper[16352]: I0307 21:18:27.734502 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.735100 master-0 kubenswrapper[16352]: I0307 21:18:27.735063 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.741945 master-0 kubenswrapper[16352]: I0307 21:18:27.741894 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.742102 master-0 kubenswrapper[16352]: I0307 21:18:27.742019 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.742143 master-0 kubenswrapper[16352]: I0307 21:18:27.742030 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-main-tls\") pod 
\"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.750020 master-0 kubenswrapper[16352]: I0307 21:18:27.749934 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-out\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.750447 master-0 kubenswrapper[16352]: I0307 21:18:27.750393 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-tls-assets\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.750559 master-0 kubenswrapper[16352]: I0307 21:18:27.750528 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-web-config\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:27.761612 master-0 kubenswrapper[16352]: I0307 21:18:27.761224 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw7xl\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-kube-api-access-dw7xl\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:28.241438 master-0 kubenswrapper[16352]: I0307 21:18:28.240325 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: 
\"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:28.241438 master-0 kubenswrapper[16352]: E0307 21:18:28.240632 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:29.240607526 +0000 UTC m=+32.311312615 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:28.348634 master-0 kubenswrapper[16352]: I0307 21:18:28.347247 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-9995cd46f-q546g"] Mar 07 21:18:28.349255 master-0 kubenswrapper[16352]: I0307 21:18:28.349233 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.354014 master-0 kubenswrapper[16352]: I0307 21:18:28.353728 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-k9ktl35kg68d" Mar 07 21:18:28.354014 master-0 kubenswrapper[16352]: I0307 21:18:28.353940 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-tls" Mar 07 21:18:28.354199 master-0 kubenswrapper[16352]: I0307 21:18:28.354091 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 07 21:18:28.354199 master-0 kubenswrapper[16352]: I0307 21:18:28.354152 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-t2xgn" Mar 07 21:18:28.354256 master-0 kubenswrapper[16352]: I0307 21:18:28.354197 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics" Mar 07 21:18:28.354431 master-0 kubenswrapper[16352]: I0307 21:18:28.354220 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 07 21:18:28.360977 master-0 kubenswrapper[16352]: I0307 21:18:28.358040 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web" Mar 07 21:18:28.384030 master-0 kubenswrapper[16352]: I0307 21:18:28.383519 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9995cd46f-q546g"] Mar 07 21:18:28.445709 master-0 kubenswrapper[16352]: I0307 21:18:28.444716 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-grpc-tls\") pod 
\"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.445709 master-0 kubenswrapper[16352]: I0307 21:18:28.444882 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02f2abd4-a05b-4063-9e15-d66d5813dd5e-metrics-client-ca\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.445709 master-0 kubenswrapper[16352]: I0307 21:18:28.444929 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7x2qq\" (UniqueName: \"kubernetes.io/projected/02f2abd4-a05b-4063-9e15-d66d5813dd5e-kube-api-access-7x2qq\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.445709 master-0 kubenswrapper[16352]: I0307 21:18:28.445145 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.445709 master-0 kubenswrapper[16352]: I0307 21:18:28.445206 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-tls\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 
21:18:28.445709 master-0 kubenswrapper[16352]: I0307 21:18:28.445235 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.445709 master-0 kubenswrapper[16352]: I0307 21:18:28.445438 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.446089 master-0 kubenswrapper[16352]: I0307 21:18:28.445770 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.546952 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7x2qq\" (UniqueName: \"kubernetes.io/projected/02f2abd4-a05b-4063-9e15-d66d5813dd5e-kube-api-access-7x2qq\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.547034 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.547057 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-tls\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.547075 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.547121 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.547146 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.547173 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-grpc-tls\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.547218 master-0 kubenswrapper[16352]: I0307 21:18:28.547207 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02f2abd4-a05b-4063-9e15-d66d5813dd5e-metrics-client-ca\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.549084 master-0 kubenswrapper[16352]: I0307 21:18:28.548095 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/02f2abd4-a05b-4063-9e15-d66d5813dd5e-metrics-client-ca\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.551459 master-0 kubenswrapper[16352]: I0307 21:18:28.551400 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.552052 master-0 
kubenswrapper[16352]: I0307 21:18:28.552001 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.553576 master-0 kubenswrapper[16352]: I0307 21:18:28.553517 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.556082 master-0 kubenswrapper[16352]: I0307 21:18:28.556031 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-tls\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.556472 master-0 kubenswrapper[16352]: I0307 21:18:28.556429 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.556851 master-0 kubenswrapper[16352]: I0307 21:18:28.556805 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/02f2abd4-a05b-4063-9e15-d66d5813dd5e-secret-grpc-tls\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.572431 master-0 kubenswrapper[16352]: I0307 21:18:28.570777 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7x2qq\" (UniqueName: \"kubernetes.io/projected/02f2abd4-a05b-4063-9e15-d66d5813dd5e-kube-api-access-7x2qq\") pod \"thanos-querier-9995cd46f-q546g\" (UID: \"02f2abd4-a05b-4063-9e15-d66d5813dd5e\") " pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:28.726736 master-0 kubenswrapper[16352]: I0307 21:18:28.726284 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" Mar 07 21:18:29.257769 master-0 kubenswrapper[16352]: I0307 21:18:29.257697 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:29.258101 master-0 kubenswrapper[16352]: E0307 21:18:29.257919 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:31.257898751 +0000 UTC m=+34.328603830 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:29.533143 master-0 kubenswrapper[16352]: I0307 21:18:29.533043 16352 generic.go:334] "Generic (PLEG): container finished" podID="ea0c0a12-3b05-45aa-900f-e1757ece41a1" containerID="3e1cd05bc2fdaaaf55a039ed228f66998b33e9bec6bb2617abdd18108d190f21" exitCode=0 Mar 07 21:18:29.533436 master-0 kubenswrapper[16352]: I0307 21:18:29.533148 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c8pdj" event={"ID":"ea0c0a12-3b05-45aa-900f-e1757ece41a1","Type":"ContainerDied","Data":"3e1cd05bc2fdaaaf55a039ed228f66998b33e9bec6bb2617abdd18108d190f21"} Mar 07 21:18:30.126114 master-0 kubenswrapper[16352]: I0307 21:18:30.125957 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-9995cd46f-q546g"] Mar 07 21:18:30.545384 master-0 kubenswrapper[16352]: I0307 21:18:30.545322 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" event={"ID":"f9010245-1668-4f71-979a-077fb174a8d5","Type":"ContainerStarted","Data":"564fefeabc8c1d127c15d776611ccacf1273e795a247890db8b9b1603d9f77f8"} Mar 07 21:18:30.545384 master-0 kubenswrapper[16352]: I0307 21:18:30.545390 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" event={"ID":"f9010245-1668-4f71-979a-077fb174a8d5","Type":"ContainerStarted","Data":"f7b935e38196eeeda87671e74a74f80e0342b088d5e7f1eb5305a452c801a036"} Mar 07 21:18:30.545700 master-0 kubenswrapper[16352]: I0307 21:18:30.545411 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" 
event={"ID":"f9010245-1668-4f71-979a-077fb174a8d5","Type":"ContainerStarted","Data":"bacc3751bd605421a95d6e019698708df7ac8ce4c0eba6b3c18aaa33e559261a"} Mar 07 21:18:30.549135 master-0 kubenswrapper[16352]: I0307 21:18:30.549097 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c8pdj" event={"ID":"ea0c0a12-3b05-45aa-900f-e1757ece41a1","Type":"ContainerStarted","Data":"cb35bd4f0dc58c7d4986853e79c8d972ad0ff6492b69f67982209a728c23c8b9"} Mar 07 21:18:30.549295 master-0 kubenswrapper[16352]: I0307 21:18:30.549145 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c8pdj" event={"ID":"ea0c0a12-3b05-45aa-900f-e1757ece41a1","Type":"ContainerStarted","Data":"580a01239d9a9938e586f4cd3dbe375ac5c743a010b2052fdb09cdb809ecb318"} Mar 07 21:18:30.550810 master-0 kubenswrapper[16352]: I0307 21:18:30.550774 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" event={"ID":"02f2abd4-a05b-4063-9e15-d66d5813dd5e","Type":"ContainerStarted","Data":"6e85bee00a0cf986be77788802a133d3343d593eb4598193ca7675884791e7ac"} Mar 07 21:18:30.553340 master-0 kubenswrapper[16352]: I0307 21:18:30.553318 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" event={"ID":"c32f43eb-a408-493b-bf11-9386ef49cf68","Type":"ContainerStarted","Data":"6433e7b4e049f3f187eecb991f98158aa46391c3723c1486e719c8eb3f5a24b9"} Mar 07 21:18:30.573272 master-0 kubenswrapper[16352]: I0307 21:18:30.573163 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-68b88f8cb5-5cj66" podStartSLOduration=2.044066534 podStartE2EDuration="4.573138025s" podCreationTimestamp="2026-03-07 21:18:26 +0000 UTC" firstStartedPulling="2026-03-07 21:18:27.146394023 +0000 UTC m=+30.217099082" lastFinishedPulling="2026-03-07 21:18:29.675465514 +0000 UTC m=+32.746170573" 
observedRunningTime="2026-03-07 21:18:30.570112782 +0000 UTC m=+33.640817851" watchObservedRunningTime="2026-03-07 21:18:30.573138025 +0000 UTC m=+33.643843094" Mar 07 21:18:30.616369 master-0 kubenswrapper[16352]: I0307 21:18:30.614778 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-c8pdj" podStartSLOduration=2.900615295 podStartE2EDuration="4.614742138s" podCreationTimestamp="2026-03-07 21:18:26 +0000 UTC" firstStartedPulling="2026-03-07 21:18:26.598516029 +0000 UTC m=+29.669221098" lastFinishedPulling="2026-03-07 21:18:28.312642882 +0000 UTC m=+31.383347941" observedRunningTime="2026-03-07 21:18:30.603403964 +0000 UTC m=+33.674109063" watchObservedRunningTime="2026-03-07 21:18:30.614742138 +0000 UTC m=+33.685447217" Mar 07 21:18:30.638400 master-0 kubenswrapper[16352]: I0307 21:18:30.637735 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-74cc79fd76-84z7r" podStartSLOduration=2.338026505 podStartE2EDuration="4.637708592s" podCreationTimestamp="2026-03-07 21:18:26 +0000 UTC" firstStartedPulling="2026-03-07 21:18:27.376466463 +0000 UTC m=+30.447171522" lastFinishedPulling="2026-03-07 21:18:29.67614855 +0000 UTC m=+32.746853609" observedRunningTime="2026-03-07 21:18:30.637312343 +0000 UTC m=+33.708017452" watchObservedRunningTime="2026-03-07 21:18:30.637708592 +0000 UTC m=+33.708413651" Mar 07 21:18:31.304943 master-0 kubenswrapper[16352]: I0307 21:18:31.304878 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:31.305453 master-0 kubenswrapper[16352]: E0307 21:18:31.305260 16352 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:35.305217092 +0000 UTC m=+38.375922151 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:31.751297 master-0 kubenswrapper[16352]: I0307 21:18:31.751214 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q"] Mar 07 21:18:31.752654 master-0 kubenswrapper[16352]: I0307 21:18:31.752321 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.756298 master-0 kubenswrapper[16352]: I0307 21:18:31.756253 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 07 21:18:31.756609 master-0 kubenswrapper[16352]: I0307 21:18:31.756575 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs" Mar 07 21:18:31.760622 master-0 kubenswrapper[16352]: I0307 21:18:31.760568 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3j1qmkmjalrq1" Mar 07 21:18:31.761465 master-0 kubenswrapper[16352]: I0307 21:18:31.761444 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle" Mar 07 21:18:31.761727 master-0 kubenswrapper[16352]: I0307 21:18:31.761704 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls" Mar 07 
21:18:31.761918 master-0 kubenswrapper[16352]: I0307 21:18:31.761866 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-wbb4k" Mar 07 21:18:31.764025 master-0 kubenswrapper[16352]: I0307 21:18:31.763980 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q"] Mar 07 21:18:31.916852 master-0 kubenswrapper[16352]: I0307 21:18:31.916746 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f458d2b8-7820-4db7-9837-bd184760ed36-metrics-server-audit-profiles\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.917080 master-0 kubenswrapper[16352]: I0307 21:18:31.916882 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fwwxm\" (UniqueName: \"kubernetes.io/projected/f458d2b8-7820-4db7-9837-bd184760ed36-kube-api-access-fwwxm\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.917080 master-0 kubenswrapper[16352]: I0307 21:18:31.916915 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-secret-metrics-server-tls\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.917080 master-0 kubenswrapper[16352]: I0307 21:18:31.916988 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-client-ca-bundle\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.917080 master-0 kubenswrapper[16352]: I0307 21:18:31.917013 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f458d2b8-7820-4db7-9837-bd184760ed36-audit-log\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.917080 master-0 kubenswrapper[16352]: I0307 21:18:31.917066 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f458d2b8-7820-4db7-9837-bd184760ed36-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.917451 master-0 kubenswrapper[16352]: I0307 21:18:31.917107 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-secret-metrics-client-certs\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:31.979390 master-0 kubenswrapper[16352]: I0307 21:18:31.965767 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2"] Mar 07 21:18:31.979390 master-0 kubenswrapper[16352]: I0307 21:18:31.967025 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" Mar 07 21:18:31.979390 master-0 kubenswrapper[16352]: I0307 21:18:31.977237 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert" Mar 07 21:18:31.979390 master-0 kubenswrapper[16352]: I0307 21:18:31.977527 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-q695j" Mar 07 21:18:31.993604 master-0 kubenswrapper[16352]: I0307 21:18:31.993538 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2"] Mar 07 21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.030631 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f458d2b8-7820-4db7-9837-bd184760ed36-audit-log\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.030912 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f458d2b8-7820-4db7-9837-bd184760ed36-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.031118 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-secret-metrics-client-certs\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 
21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.031201 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f458d2b8-7820-4db7-9837-bd184760ed36-metrics-server-audit-profiles\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.031383 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fwwxm\" (UniqueName: \"kubernetes.io/projected/f458d2b8-7820-4db7-9837-bd184760ed36-kube-api-access-fwwxm\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.031456 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-secret-metrics-server-tls\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.031486 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-client-ca-bundle\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.032803 master-0 kubenswrapper[16352]: I0307 21:18:32.032519 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/f458d2b8-7820-4db7-9837-bd184760ed36-audit-log\") pod 
\"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.055724 master-0 kubenswrapper[16352]: I0307 21:18:32.042515 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f458d2b8-7820-4db7-9837-bd184760ed36-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.068805 master-0 kubenswrapper[16352]: I0307 21:18:32.066570 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-secret-metrics-client-certs\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.068805 master-0 kubenswrapper[16352]: I0307 21:18:32.066958 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fwwxm\" (UniqueName: \"kubernetes.io/projected/f458d2b8-7820-4db7-9837-bd184760ed36-kube-api-access-fwwxm\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.068805 master-0 kubenswrapper[16352]: I0307 21:18:32.068062 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/f458d2b8-7820-4db7-9837-bd184760ed36-metrics-server-audit-profiles\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.068805 master-0 kubenswrapper[16352]: I0307 21:18:32.068644 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-secret-metrics-server-tls\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.070700 master-0 kubenswrapper[16352]: I0307 21:18:32.070627 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f458d2b8-7820-4db7-9837-bd184760ed36-client-ca-bundle\") pod \"metrics-server-6fdfc4cfb9-d2n6q\" (UID: \"f458d2b8-7820-4db7-9837-bd184760ed36\") " pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.095076 master-0 kubenswrapper[16352]: I0307 21:18:32.095006 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" Mar 07 21:18:32.133197 master-0 kubenswrapper[16352]: I0307 21:18:32.133130 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8008fdc-cf66-4eef-b062-c339ddbb71a4-monitoring-plugin-cert\") pod \"monitoring-plugin-6bc88968b6-frbh2\" (UID: \"c8008fdc-cf66-4eef-b062-c339ddbb71a4\") " pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" Mar 07 21:18:32.235602 master-0 kubenswrapper[16352]: I0307 21:18:32.235524 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8008fdc-cf66-4eef-b062-c339ddbb71a4-monitoring-plugin-cert\") pod \"monitoring-plugin-6bc88968b6-frbh2\" (UID: \"c8008fdc-cf66-4eef-b062-c339ddbb71a4\") " pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" Mar 07 21:18:32.241121 master-0 kubenswrapper[16352]: I0307 21:18:32.240636 16352 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c8008fdc-cf66-4eef-b062-c339ddbb71a4-monitoring-plugin-cert\") pod \"monitoring-plugin-6bc88968b6-frbh2\" (UID: \"c8008fdc-cf66-4eef-b062-c339ddbb71a4\") " pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" Mar 07 21:18:32.407424 master-0 kubenswrapper[16352]: I0307 21:18:32.407367 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" Mar 07 21:18:32.623443 master-0 kubenswrapper[16352]: I0307 21:18:32.620899 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:18:32.635588 master-0 kubenswrapper[16352]: I0307 21:18:32.635035 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.641262 master-0 kubenswrapper[16352]: I0307 21:18:32.639089 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-2ibssd7q5gl39" Mar 07 21:18:32.641756 master-0 kubenswrapper[16352]: I0307 21:18:32.641548 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 07 21:18:32.645915 master-0 kubenswrapper[16352]: I0307 21:18:32.644863 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 07 21:18:32.645915 master-0 kubenswrapper[16352]: I0307 21:18:32.645356 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-l4g8x" Mar 07 21:18:32.645915 master-0 kubenswrapper[16352]: I0307 21:18:32.645630 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 07 21:18:32.645915 master-0 kubenswrapper[16352]: I0307 21:18:32.645668 16352 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 07 21:18:32.653788 master-0 kubenswrapper[16352]: I0307 21:18:32.649425 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 07 21:18:32.653788 master-0 kubenswrapper[16352]: I0307 21:18:32.649576 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle" Mar 07 21:18:32.655578 master-0 kubenswrapper[16352]: I0307 21:18:32.654341 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:18:32.655578 master-0 kubenswrapper[16352]: I0307 21:18:32.654796 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 07 21:18:32.655578 master-0 kubenswrapper[16352]: I0307 21:18:32.655220 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 07 21:18:32.658954 master-0 kubenswrapper[16352]: I0307 21:18:32.658345 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 07 21:18:32.661847 master-0 kubenswrapper[16352]: I0307 21:18:32.660110 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0" Mar 07 21:18:32.661847 master-0 kubenswrapper[16352]: I0307 21:18:32.661215 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 07 21:18:32.764200 master-0 kubenswrapper[16352]: I0307 21:18:32.764150 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764319 master-0 kubenswrapper[16352]: I0307 21:18:32.764211 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config-out\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764319 master-0 kubenswrapper[16352]: I0307 21:18:32.764254 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764319 master-0 kubenswrapper[16352]: I0307 21:18:32.764291 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764436 master-0 kubenswrapper[16352]: I0307 21:18:32.764407 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764476 master-0 kubenswrapper[16352]: I0307 21:18:32.764433 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764519 master-0 kubenswrapper[16352]: I0307 21:18:32.764494 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6bcp\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-kube-api-access-h6bcp\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764554 master-0 kubenswrapper[16352]: I0307 21:18:32.764545 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764635 master-0 kubenswrapper[16352]: I0307 21:18:32.764590 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-web-config\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764778 master-0 kubenswrapper[16352]: I0307 21:18:32.764672 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764897 master-0 kubenswrapper[16352]: I0307 21:18:32.764849 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764897 master-0 kubenswrapper[16352]: I0307 21:18:32.764890 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.764964 master-0 kubenswrapper[16352]: I0307 21:18:32.764935 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.765024 master-0 kubenswrapper[16352]: I0307 21:18:32.764999 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.765077 master-0 kubenswrapper[16352]: I0307 21:18:32.765055 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: 
\"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.767894 master-0 kubenswrapper[16352]: I0307 21:18:32.765211 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.767894 master-0 kubenswrapper[16352]: I0307 21:18:32.765315 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.767894 master-0 kubenswrapper[16352]: I0307 21:18:32.765337 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873230 master-0 kubenswrapper[16352]: I0307 21:18:32.871992 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873230 master-0 kubenswrapper[16352]: I0307 21:18:32.872088 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-web-config\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873230 master-0 kubenswrapper[16352]: I0307 21:18:32.872115 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873230 master-0 kubenswrapper[16352]: I0307 21:18:32.872272 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873230 master-0 kubenswrapper[16352]: I0307 21:18:32.872309 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873230 master-0 kubenswrapper[16352]: I0307 21:18:32.872349 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873230 master-0 kubenswrapper[16352]: I0307 21:18:32.872405 16352 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873251 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873336 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873395 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873413 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 
21:18:32.873452 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873475 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873494 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config-out\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873519 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873576 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 
21:18:32.873597 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.873881 master-0 kubenswrapper[16352]: I0307 21:18:32.873647 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h6bcp\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-kube-api-access-h6bcp\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.874486 master-0 kubenswrapper[16352]: I0307 21:18:32.874444 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.876291 master-0 kubenswrapper[16352]: I0307 21:18:32.876206 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.876397 master-0 kubenswrapper[16352]: E0307 21:18:32.876356 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:33.376321576 +0000 UTC m=+36.447026635 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:32.876722 master-0 kubenswrapper[16352]: I0307 21:18:32.876599 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.877680 master-0 kubenswrapper[16352]: I0307 21:18:32.877375 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.877680 master-0 kubenswrapper[16352]: I0307 21:18:32.877583 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.878888 master-0 kubenswrapper[16352]: I0307 21:18:32.878845 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.879041 master-0 kubenswrapper[16352]: I0307 21:18:32.879006 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.879716 master-0 kubenswrapper[16352]: I0307 21:18:32.879622 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.880643 master-0 kubenswrapper[16352]: I0307 21:18:32.880611 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config-out\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.885595 master-0 kubenswrapper[16352]: I0307 21:18:32.885564 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.885701 master-0 kubenswrapper[16352]: I0307 21:18:32.885563 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-web-config\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.885788 master-0 kubenswrapper[16352]: I0307 21:18:32.885748 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.885863 master-0 kubenswrapper[16352]: I0307 21:18:32.885819 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.885945 master-0 kubenswrapper[16352]: I0307 21:18:32.885916 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.888559 master-0 kubenswrapper[16352]: I0307 21:18:32.888484 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.892228 master-0 kubenswrapper[16352]: I0307 21:18:32.892181 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6bcp\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-kube-api-access-h6bcp\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:32.903420 master-0 kubenswrapper[16352]: I0307 21:18:32.903349 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:33.050140 master-0 kubenswrapper[16352]: I0307 21:18:33.050067 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2"] Mar 07 21:18:33.058422 master-0 kubenswrapper[16352]: W0307 21:18:33.058352 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc8008fdc_cf66_4eef_b062_c339ddbb71a4.slice/crio-1347c00377fc03a12a5adf06b771cb93aa2a6b7e4c68388cc95a59aaf4794e2b WatchSource:0}: Error finding container 1347c00377fc03a12a5adf06b771cb93aa2a6b7e4c68388cc95a59aaf4794e2b: Status 404 returned error can't find the container with id 1347c00377fc03a12a5adf06b771cb93aa2a6b7e4c68388cc95a59aaf4794e2b Mar 07 21:18:33.135105 master-0 kubenswrapper[16352]: I0307 21:18:33.135038 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q"] Mar 07 21:18:33.142356 master-0 kubenswrapper[16352]: W0307 21:18:33.142306 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf458d2b8_7820_4db7_9837_bd184760ed36.slice/crio-40a54ad9628a7eb89e68a2c04ab3a08d68cba62f39a31d128ddb53a5ef37c130 WatchSource:0}: Error finding container 40a54ad9628a7eb89e68a2c04ab3a08d68cba62f39a31d128ddb53a5ef37c130: Status 404 returned error can't find the container with id 40a54ad9628a7eb89e68a2c04ab3a08d68cba62f39a31d128ddb53a5ef37c130 Mar 07 21:18:33.389255 master-0 kubenswrapper[16352]: I0307 21:18:33.389182 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:33.389534 master-0 kubenswrapper[16352]: E0307 21:18:33.389441 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:34.389414201 +0000 UTC m=+37.460119260 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:33.587371 master-0 kubenswrapper[16352]: I0307 21:18:33.587272 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" event={"ID":"02f2abd4-a05b-4063-9e15-d66d5813dd5e","Type":"ContainerStarted","Data":"67d4e2251c3d9b948605860b228312714813ad1436b40da74944275031d31eff"} Mar 07 21:18:33.587371 master-0 kubenswrapper[16352]: I0307 21:18:33.587349 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" event={"ID":"02f2abd4-a05b-4063-9e15-d66d5813dd5e","Type":"ContainerStarted","Data":"e653fc843630756e66f8d29c2b9625ffa3ea04d32438e5a6a0be8d71c0359d30"} Mar 07 21:18:33.587371 master-0 kubenswrapper[16352]: I0307 21:18:33.587362 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" 
event={"ID":"02f2abd4-a05b-4063-9e15-d66d5813dd5e","Type":"ContainerStarted","Data":"c65dce0d0a029139750c67afd482796e955ce0ef6e451cc328ca62dcd4f90524"} Mar 07 21:18:33.589103 master-0 kubenswrapper[16352]: I0307 21:18:33.589056 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" event={"ID":"f458d2b8-7820-4db7-9837-bd184760ed36","Type":"ContainerStarted","Data":"40a54ad9628a7eb89e68a2c04ab3a08d68cba62f39a31d128ddb53a5ef37c130"} Mar 07 21:18:33.590406 master-0 kubenswrapper[16352]: I0307 21:18:33.590368 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" event={"ID":"c8008fdc-cf66-4eef-b062-c339ddbb71a4","Type":"ContainerStarted","Data":"1347c00377fc03a12a5adf06b771cb93aa2a6b7e4c68388cc95a59aaf4794e2b"} Mar 07 21:18:34.409234 master-0 kubenswrapper[16352]: I0307 21:18:34.409157 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:18:34.409507 master-0 kubenswrapper[16352]: E0307 21:18:34.409394 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:36.409364632 +0000 UTC m=+39.480069691 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:35.326494 master-0 kubenswrapper[16352]: I0307 21:18:35.326416 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:18:35.327127 master-0 kubenswrapper[16352]: E0307 21:18:35.326645 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:43.326615736 +0000 UTC m=+46.397320795 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt Mar 07 21:18:35.327127 master-0 kubenswrapper[16352]: I0307 21:18:35.326720 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0" Mar 07 21:18:35.327127 master-0 kubenswrapper[16352]: E0307 21:18:35.326925 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:35.327127 master-0 kubenswrapper[16352]: E0307 21:18:35.326952 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:35.327127 master-0 kubenswrapper[16352]: E0307 21:18:35.327002 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:19:07.326986615 +0000 UTC m=+70.397691684 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered Mar 07 21:18:35.607541 master-0 kubenswrapper[16352]: I0307 21:18:35.607452 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" event={"ID":"c8008fdc-cf66-4eef-b062-c339ddbb71a4","Type":"ContainerStarted","Data":"d472a51c8088a058b456498d5cdfec376082b811cdf0d270aa2d5f36664a4f5c"} Mar 07 21:18:35.607869 master-0 kubenswrapper[16352]: I0307 21:18:35.607802 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" Mar 07 21:18:35.614425 master-0 kubenswrapper[16352]: I0307 21:18:35.614112 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" event={"ID":"02f2abd4-a05b-4063-9e15-d66d5813dd5e","Type":"ContainerStarted","Data":"9ceb880f993d3b23551b1b19e1081da3981d2ebdc9ff765bd8d6d26c1be86262"} Mar 07 21:18:35.614425 master-0 kubenswrapper[16352]: I0307 21:18:35.614176 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" event={"ID":"02f2abd4-a05b-4063-9e15-d66d5813dd5e","Type":"ContainerStarted","Data":"8d88cb1e306383af56b8627cb2d3beeace65e52d466c16831010d0e4e9435b10"} Mar 07 21:18:35.614425 master-0 kubenswrapper[16352]: I0307 21:18:35.614196 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" event={"ID":"02f2abd4-a05b-4063-9e15-d66d5813dd5e","Type":"ContainerStarted","Data":"5f7851468cbcc98f3f0c43be02b4cfdd38d8306f381130ea14826caf1fb7112c"} Mar 07 21:18:35.614425 master-0 kubenswrapper[16352]: I0307 21:18:35.614370 16352 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g"
Mar 07 21:18:35.616301 master-0 kubenswrapper[16352]: I0307 21:18:35.616165 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" event={"ID":"f458d2b8-7820-4db7-9837-bd184760ed36","Type":"ContainerStarted","Data":"bbc2758c66a9ec6eb9570cbda0a203723012ee41f6c736e1a44930cced2f1d0b"}
Mar 07 21:18:35.617146 master-0 kubenswrapper[16352]: I0307 21:18:35.617088 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2"
Mar 07 21:18:35.629901 master-0 kubenswrapper[16352]: I0307 21:18:35.629800 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-6bc88968b6-frbh2" podStartSLOduration=2.7525316699999998 podStartE2EDuration="4.629781408s" podCreationTimestamp="2026-03-07 21:18:31 +0000 UTC" firstStartedPulling="2026-03-07 21:18:33.061774439 +0000 UTC m=+36.132479508" lastFinishedPulling="2026-03-07 21:18:34.939024187 +0000 UTC m=+38.009729246" observedRunningTime="2026-03-07 21:18:35.628600989 +0000 UTC m=+38.699306048" watchObservedRunningTime="2026-03-07 21:18:35.629781408 +0000 UTC m=+38.700486467"
Mar 07 21:18:35.652650 master-0 kubenswrapper[16352]: I0307 21:18:35.652578 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q" podStartSLOduration=2.84542648 podStartE2EDuration="4.652551057s" podCreationTimestamp="2026-03-07 21:18:31 +0000 UTC" firstStartedPulling="2026-03-07 21:18:33.146196765 +0000 UTC m=+36.216901834" lastFinishedPulling="2026-03-07 21:18:34.953321352 +0000 UTC m=+38.024026411" observedRunningTime="2026-03-07 21:18:35.647169478 +0000 UTC m=+38.717874617" watchObservedRunningTime="2026-03-07 21:18:35.652551057 +0000 UTC m=+38.723256116"
Mar 07 21:18:35.683246 master-0
kubenswrapper[16352]: I0307 21:18:35.683104 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g" podStartSLOduration=2.88557474 podStartE2EDuration="7.683076963s" podCreationTimestamp="2026-03-07 21:18:28 +0000 UTC" firstStartedPulling="2026-03-07 21:18:30.135213572 +0000 UTC m=+33.205918631" lastFinishedPulling="2026-03-07 21:18:34.932715795 +0000 UTC m=+38.003420854" observedRunningTime="2026-03-07 21:18:35.67961935 +0000 UTC m=+38.750324429" watchObservedRunningTime="2026-03-07 21:18:35.683076963 +0000 UTC m=+38.753782022"
Mar 07 21:18:36.449509 master-0 kubenswrapper[16352]: I0307 21:18:36.449401 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:18:36.450396 master-0 kubenswrapper[16352]: E0307 21:18:36.449724 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:40.449655053 +0000 UTC m=+43.520360282 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:38.748669 master-0 kubenswrapper[16352]: I0307 21:18:38.748551 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-9995cd46f-q546g"
Mar 07 21:18:39.107196 master-0 kubenswrapper[16352]: I0307 21:18:39.107084 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:18:39.107610 master-0 kubenswrapper[16352]: E0307 21:18:39.107468 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:19:11.107427697 +0000 UTC m=+74.178132786 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:40.535078 master-0 kubenswrapper[16352]: I0307 21:18:40.534935 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:18:40.536159 master-0 kubenswrapper[16352]: E0307 21:18:40.535253 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:48.535213225 +0000 UTC m=+51.605918314 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:43.391825 master-0 kubenswrapper[16352]: I0307 21:18:43.391725 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:18:43.393086 master-0 kubenswrapper[16352]: E0307 21:18:43.392063 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:18:59.392011599 +0000 UTC m=+62.462716818 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:48.590208 master-0 kubenswrapper[16352]: I0307 21:18:48.590085 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:18:48.591482 master-0 kubenswrapper[16352]: E0307 21:18:48.590343 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:19:04.590323199 +0000 UTC m=+67.661028268 (durationBeforeRetry 16s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:18:52.365957 master-0 kubenswrapper[16352]: I0307 21:18:52.365888 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q"
Mar 07 21:18:52.366748 master-0 kubenswrapper[16352]: I0307 21:18:52.366700 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q"
Mar 07 21:18:59.401803 master-0 kubenswrapper[16352]: I0307 21:18:59.401549 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:18:59.402554 master-0 kubenswrapper[16352]: E0307 21:18:59.401825 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:19:31.401802746 +0000 UTC m=+94.472507815 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:19:04.645090 master-0 kubenswrapper[16352]: I0307 21:19:04.644986 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:19:04.646517 master-0 kubenswrapper[16352]: E0307 21:19:04.645360 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:19:36.645292564 +0000 UTC m=+99.715997663 (durationBeforeRetry 32s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:19:07.398801 master-0 kubenswrapper[16352]: I0307 21:19:07.398576 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:19:07.399766 master-0 kubenswrapper[16352]: E0307 21:19:07.399002 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:19:07.399766 master-0 kubenswrapper[16352]: E0307 21:19:07.399092 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:19:07.399766 master-0 kubenswrapper[16352]: E0307 21:19:07.399228 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:20:11.399192807 +0000 UTC m=+134.469897896 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:19:11.182323 master-0 kubenswrapper[16352]: I0307 21:19:11.182228 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:19:11.183358 master-0 kubenswrapper[16352]: E0307 21:19:11.182491 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:20:15.182452248 +0000 UTC m=+138.253157347 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:19:12.105383 master-0 kubenswrapper[16352]: I0307 21:19:12.105290 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q"
Mar 07 21:19:12.110977 master-0 kubenswrapper[16352]: I0307 21:19:12.110904 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-6fdfc4cfb9-d2n6q"
Mar 07 21:19:31.429352 master-0 kubenswrapper[16352]: I0307 21:19:31.429089 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:19:31.430157 master-0 kubenswrapper[16352]: E0307 21:19:31.429506 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:20:35.429471609 +0000 UTC m=+158.500176698 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:19:36.652280 master-0 kubenswrapper[16352]: I0307 21:19:36.652187 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:19:36.653377 master-0 kubenswrapper[16352]: E0307 21:19:36.652450 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:20:40.652412143 +0000 UTC m=+163.723117242 (durationBeforeRetry 1m4s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:19:40.496984 master-0 kubenswrapper[16352]: I0307 21:19:40.496869 16352 patch_prober.go:28] interesting pod/machine-config-daemon-kp74q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 21:19:40.497843 master-0 kubenswrapper[16352]: I0307 21:19:40.496980 16352 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" podUID="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 21:19:57.231770 master-0 kubenswrapper[16352]: I0307 21:19:57.231658 16352 scope.go:117] "RemoveContainer" containerID="485cabca7a9edbb9a83d8ef9ee43891f8c296cb8958998f7a4fa97d4fc8e25c3"
Mar 07 21:20:10.180284 master-0 kubenswrapper[16352]: E0307 21:20:10.180162 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" podUID="28fd6ebb-51ac-4763-99b2-3a94b124d059"
Mar 07 21:20:10.497441 master-0 kubenswrapper[16352]: I0307 21:20:10.497199 16352 patch_prober.go:28] interesting pod/machine-config-daemon-kp74q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
start-of-body=
Mar 07 21:20:10.497441 master-0 kubenswrapper[16352]: I0307 21:20:10.497324 16352 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" podUID="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 21:20:10.606271 master-0 kubenswrapper[16352]: I0307 21:20:10.606193 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:20:11.438139 master-0 kubenswrapper[16352]: I0307 21:20:11.438006 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:20:11.439161 master-0 kubenswrapper[16352]: E0307 21:20:11.438255 16352 projected.go:288] Couldn't get configMap openshift-kube-apiserver/kube-root-ca.crt: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:20:11.439161 master-0 kubenswrapper[16352]: E0307 21:20:11.438317 16352 projected.go:194] Error preparing data for projected volume kube-api-access for pod openshift-kube-apiserver/installer-1-retry-1-master-0: object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:20:11.439161 master-0 kubenswrapper[16352]: E0307 21:20:11.438417 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access podName:2357c135-5d09-4657-9038-48d25ed55b2d nodeName:}" failed. No retries permitted until 2026-03-07 21:22:13.438385053 +0000 UTC m=+256.509090142 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "kube-api-access" (UniqueName: "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access") pod "installer-1-retry-1-master-0" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d") : object "openshift-kube-apiserver"/"kube-root-ca.crt" not registered
Mar 07 21:20:15.207383 master-0 kubenswrapper[16352]: I0307 21:20:15.207221 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:20:15.208901 master-0 kubenswrapper[16352]: E0307 21:20:15.207567 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca podName:28fd6ebb-51ac-4763-99b2-3a94b124d059 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:17.207517362 +0000 UTC m=+260.278222461 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "trusted-ca" (UniqueName: "kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca") pod "console-operator-6c7fb6b958-2grlf" (UID: "28fd6ebb-51ac-4763-99b2-3a94b124d059") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:20:30.540177 master-0 kubenswrapper[16352]: E0307 21:20:30.539982 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[alertmanager-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88"
Mar 07 21:20:30.785094 master-0 kubenswrapper[16352]: I0307 21:20:30.784982 16352 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:20:35.505160 master-0 kubenswrapper[16352]: I0307 21:20:35.505062 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:20:35.507292 master-0 kubenswrapper[16352]: E0307 21:20:35.505298 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle podName:65a24af7-ab85-4c88-ab84-c98d1b4efa88 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:37.505275142 +0000 UTC m=+280.575980191 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:20:35.733842 master-0 kubenswrapper[16352]: E0307 21:20:35.733734 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[prometheus-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-k8s-0" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24"
Mar 07 21:20:35.828083 master-0 kubenswrapper[16352]: I0307 21:20:35.827895 16352 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:20:39.130529 master-0 kubenswrapper[16352]: I0307 21:20:39.130435 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-tztzb"]
Mar 07 21:20:39.140179 master-0 kubenswrapper[16352]: I0307 21:20:39.140100 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.145873 master-0 kubenswrapper[16352]: I0307 21:20:39.145531 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-mnvfv"
Mar 07 21:20:39.160801 master-0 kubenswrapper[16352]: I0307 21:20:39.155834 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 07 21:20:39.272922 master-0 kubenswrapper[16352]: I0307 21:20:39.272796 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6819f4e-46ab-4cc2-ae75-1795f0489a39-host\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.273165 master-0 kubenswrapper[16352]: I0307 21:20:39.273061 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6819f4e-46ab-4cc2-ae75-1795f0489a39-serviceca\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.273228 master-0 kubenswrapper[16352]: I0307 21:20:39.273195 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2q657\" (UniqueName: \"kubernetes.io/projected/d6819f4e-46ab-4cc2-ae75-1795f0489a39-kube-api-access-2q657\") pod \"node-ca-tztzb\" (UID:
\"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.375595 master-0 kubenswrapper[16352]: I0307 21:20:39.375488 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2q657\" (UniqueName: \"kubernetes.io/projected/d6819f4e-46ab-4cc2-ae75-1795f0489a39-kube-api-access-2q657\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.375987 master-0 kubenswrapper[16352]: I0307 21:20:39.375758 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6819f4e-46ab-4cc2-ae75-1795f0489a39-host\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.375987 master-0 kubenswrapper[16352]: I0307 21:20:39.375844 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6819f4e-46ab-4cc2-ae75-1795f0489a39-serviceca\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.376179 master-0 kubenswrapper[16352]: I0307 21:20:39.375949 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/d6819f4e-46ab-4cc2-ae75-1795f0489a39-host\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.376950 master-0 kubenswrapper[16352]: I0307 21:20:39.376895 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/d6819f4e-46ab-4cc2-ae75-1795f0489a39-serviceca\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.397898 master-0
kubenswrapper[16352]: I0307 21:20:39.397802 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2q657\" (UniqueName: \"kubernetes.io/projected/d6819f4e-46ab-4cc2-ae75-1795f0489a39-kube-api-access-2q657\") pod \"node-ca-tztzb\" (UID: \"d6819f4e-46ab-4cc2-ae75-1795f0489a39\") " pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.493388 master-0 kubenswrapper[16352]: I0307 21:20:39.493281 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-tztzb"
Mar 07 21:20:39.527071 master-0 kubenswrapper[16352]: W0307 21:20:39.526994 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6819f4e_46ab_4cc2_ae75_1795f0489a39.slice/crio-5d05f43c176a333381d5c1a6eb1f722fe898cbc209408289983c90108f19094e WatchSource:0}: Error finding container 5d05f43c176a333381d5c1a6eb1f722fe898cbc209408289983c90108f19094e: Status 404 returned error can't find the container with id 5d05f43c176a333381d5c1a6eb1f722fe898cbc209408289983c90108f19094e
Mar 07 21:20:39.865548 master-0 kubenswrapper[16352]: I0307 21:20:39.865416 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tztzb" event={"ID":"d6819f4e-46ab-4cc2-ae75-1795f0489a39","Type":"ContainerStarted","Data":"5d05f43c176a333381d5c1a6eb1f722fe898cbc209408289983c90108f19094e"}
Mar 07 21:20:40.497729 master-0 kubenswrapper[16352]: I0307 21:20:40.497604 16352 patch_prober.go:28] interesting pod/machine-config-daemon-kp74q container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 07 21:20:40.498349 master-0 kubenswrapper[16352]: I0307 21:20:40.497773 16352 prober.go:107] "Probe failed" probeType="Liveness"
pod="openshift-machine-config-operator/machine-config-daemon-kp74q" podUID="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 07 21:20:40.498349 master-0 kubenswrapper[16352]: I0307 21:20:40.497853 16352 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kp74q"
Mar 07 21:20:40.500006 master-0 kubenswrapper[16352]: I0307 21:20:40.499870 16352 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9c35d34b990f0f0f014bf31acc6957c6a30b6adb77b3eb6ea46594257c1430b0"} pod="openshift-machine-config-operator/machine-config-daemon-kp74q" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 07 21:20:40.500127 master-0 kubenswrapper[16352]: I0307 21:20:40.500074 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" podUID="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" containerName="machine-config-daemon" containerID="cri-o://9c35d34b990f0f0f014bf31acc6957c6a30b6adb77b3eb6ea46594257c1430b0" gracePeriod=600
Mar 07 21:20:40.699376 master-0 kubenswrapper[16352]: I0307 21:20:40.699277 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:20:40.702309 master-0 kubenswrapper[16352]: E0307 21:20:40.699578 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle
podName:c7ca1461-37ed-4e6b-a289-9f3249d52a24 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:42.699556357 +0000 UTC m=+285.770261426 (durationBeforeRetry 2m2s).
Error: MountVolume.SetUp failed for volume "prometheus-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle") pod "prometheus-k8s-0" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24") : configmap references non-existent config key: ca-bundle.crt
Mar 07 21:20:40.877783 master-0 kubenswrapper[16352]: I0307 21:20:40.877658 16352 generic.go:334] "Generic (PLEG): container finished" podID="655b9f0a-cf27-443d-b0ea-3642dcae1ad2" containerID="9c35d34b990f0f0f014bf31acc6957c6a30b6adb77b3eb6ea46594257c1430b0" exitCode=0
Mar 07 21:20:40.878005 master-0 kubenswrapper[16352]: I0307 21:20:40.877783 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" event={"ID":"655b9f0a-cf27-443d-b0ea-3642dcae1ad2","Type":"ContainerDied","Data":"9c35d34b990f0f0f014bf31acc6957c6a30b6adb77b3eb6ea46594257c1430b0"}
Mar 07 21:20:40.878005 master-0 kubenswrapper[16352]: I0307 21:20:40.877903 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kp74q" event={"ID":"655b9f0a-cf27-443d-b0ea-3642dcae1ad2","Type":"ContainerStarted","Data":"f61dbb9d20d17261fde90563a4881b28fba511333bd3c89c2cc2c8e3e9f58656"}
Mar 07 21:20:42.897640 master-0 kubenswrapper[16352]: I0307 21:20:42.897531 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-tztzb" event={"ID":"d6819f4e-46ab-4cc2-ae75-1795f0489a39","Type":"ContainerStarted","Data":"87e479d6343dc0592f30adb924d84507d9c5fb77044b6be32d5a757aa97d441a"}
Mar 07 21:20:42.916180 master-0 kubenswrapper[16352]: I0307 21:20:42.916049 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-tztzb"
podStartSLOduration=1.833557979 podStartE2EDuration="3.916016786s" podCreationTimestamp="2026-03-07 21:20:39 +0000 UTC" firstStartedPulling="2026-03-07 21:20:39.53049791 +0000 UTC m=+162.601203009" lastFinishedPulling="2026-03-07 21:20:41.612956757 +0000 UTC m=+164.683661816" observedRunningTime="2026-03-07 21:20:42.914404297 +0000 UTC m=+165.985109356" watchObservedRunningTime="2026-03-07 21:20:42.916016786 +0000 UTC m=+165.986721885" Mar 07 21:21:47.755383 master-0 kubenswrapper[16352]: I0307 21:21:47.754400 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 07 21:21:47.756648 master-0 kubenswrapper[16352]: I0307 21:21:47.755533 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.759149 master-0 kubenswrapper[16352]: I0307 21:21:47.759088 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 07 21:21:47.759322 master-0 kubenswrapper[16352]: I0307 21:21:47.759143 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-xlnsg" Mar 07 21:21:47.777073 master-0 kubenswrapper[16352]: I0307 21:21:47.776996 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 07 21:21:47.805534 master-0 kubenswrapper[16352]: I0307 21:21:47.805431 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2200306f-7816-4019-a6e1-5847ea5b51b1-kube-api-access\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.805984 master-0 kubenswrapper[16352]: I0307 21:21:47.805595 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-var-lock\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.805984 master-0 kubenswrapper[16352]: I0307 21:21:47.805647 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.907662 master-0 kubenswrapper[16352]: I0307 21:21:47.907588 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-var-lock\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.907959 master-0 kubenswrapper[16352]: I0307 21:21:47.907736 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.907959 master-0 kubenswrapper[16352]: I0307 21:21:47.907758 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-var-lock\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.907959 master-0 kubenswrapper[16352]: I0307 
21:21:47.907847 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.910873 master-0 kubenswrapper[16352]: I0307 21:21:47.908229 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2200306f-7816-4019-a6e1-5847ea5b51b1-kube-api-access\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:47.938007 master-0 kubenswrapper[16352]: I0307 21:21:47.937950 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2200306f-7816-4019-a6e1-5847ea5b51b1-kube-api-access\") pod \"installer-3-master-0\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:48.029126 master-0 kubenswrapper[16352]: I0307 21:21:48.028932 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 07 21:21:48.030481 master-0 kubenswrapper[16352]: I0307 21:21:48.030440 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.033185 master-0 kubenswrapper[16352]: I0307 21:21:48.033140 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-8cblb" Mar 07 21:21:48.033333 master-0 kubenswrapper[16352]: I0307 21:21:48.033300 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 07 21:21:48.042272 master-0 kubenswrapper[16352]: I0307 21:21:48.042221 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 07 21:21:48.105913 master-0 kubenswrapper[16352]: I0307 21:21:48.105854 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:21:48.112874 master-0 kubenswrapper[16352]: I0307 21:21:48.112773 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aadcdd5-aa40-442c-9434-97f150dddf70-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.113259 master-0 kubenswrapper[16352]: I0307 21:21:48.113207 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.113331 master-0 kubenswrapper[16352]: I0307 21:21:48.113302 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.137167 master-0 kubenswrapper[16352]: I0307 21:21:48.137088 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-2hhhs"] Mar 07 21:21:48.138291 master-0 kubenswrapper[16352]: I0307 21:21:48.138250 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.140372 master-0 kubenswrapper[16352]: I0307 21:21:48.140336 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-zvfzk" Mar 07 21:21:48.145352 master-0 kubenswrapper[16352]: I0307 21:21:48.145278 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist" Mar 07 21:21:48.215717 master-0 kubenswrapper[16352]: I0307 21:21:48.215519 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-ready\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.215717 master-0 kubenswrapper[16352]: I0307 21:21:48.215602 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wrr2\" (UniqueName: \"kubernetes.io/projected/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-kube-api-access-8wrr2\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.215717 master-0 kubenswrapper[16352]: I0307 21:21:48.215712 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aadcdd5-aa40-442c-9434-97f150dddf70-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.216014 master-0 kubenswrapper[16352]: I0307 21:21:48.215971 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.216014 master-0 kubenswrapper[16352]: I0307 21:21:48.216002 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.216314 master-0 kubenswrapper[16352]: I0307 21:21:48.216073 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.216314 master-0 kubenswrapper[16352]: I0307 21:21:48.216122 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.216314 master-0 kubenswrapper[16352]: I0307 
21:21:48.216149 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.216314 master-0 kubenswrapper[16352]: I0307 21:21:48.216164 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.236367 master-0 kubenswrapper[16352]: I0307 21:21:48.236340 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aadcdd5-aa40-442c-9434-97f150dddf70-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.317896 master-0 kubenswrapper[16352]: I0307 21:21:48.317839 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.318115 master-0 kubenswrapper[16352]: I0307 21:21:48.317925 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " 
pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.318115 master-0 kubenswrapper[16352]: I0307 21:21:48.317962 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-ready\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.318115 master-0 kubenswrapper[16352]: I0307 21:21:48.317995 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8wrr2\" (UniqueName: \"kubernetes.io/projected/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-kube-api-access-8wrr2\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.320151 master-0 kubenswrapper[16352]: I0307 21:21:48.319290 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.320151 master-0 kubenswrapper[16352]: I0307 21:21:48.319379 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.320151 master-0 kubenswrapper[16352]: I0307 21:21:48.319651 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-ready\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: 
\"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.336774 master-0 kubenswrapper[16352]: I0307 21:21:48.336164 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wrr2\" (UniqueName: \"kubernetes.io/projected/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-kube-api-access-8wrr2\") pod \"cni-sysctl-allowlist-ds-2hhhs\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.374788 master-0 kubenswrapper[16352]: I0307 21:21:48.374725 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:21:48.505710 master-0 kubenswrapper[16352]: I0307 21:21:48.505457 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:48.541126 master-0 kubenswrapper[16352]: W0307 21:21:48.541053 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05b2327_b1ca_4b9b_a167_68f9fcb506e6.slice/crio-3423d5650e5761050cb216135ebdea581e304f1e4fda7893df4d396f15fc6692 WatchSource:0}: Error finding container 3423d5650e5761050cb216135ebdea581e304f1e4fda7893df4d396f15fc6692: Status 404 returned error can't find the container with id 3423d5650e5761050cb216135ebdea581e304f1e4fda7893df4d396f15fc6692 Mar 07 21:21:48.619006 master-0 kubenswrapper[16352]: I0307 21:21:48.618256 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-3-master-0"] Mar 07 21:21:48.878016 master-0 kubenswrapper[16352]: I0307 21:21:48.877937 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-4-retry-1-master-0"] Mar 07 21:21:48.886415 master-0 kubenswrapper[16352]: W0307 21:21:48.886350 16352 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-pod2aadcdd5_aa40_442c_9434_97f150dddf70.slice/crio-2cd11fd645ccc00a72bc62da05ad7ccff04a454a3604982122d7a82b3d1dda53 WatchSource:0}: Error finding container 2cd11fd645ccc00a72bc62da05ad7ccff04a454a3604982122d7a82b3d1dda53: Status 404 returned error can't find the container with id 2cd11fd645ccc00a72bc62da05ad7ccff04a454a3604982122d7a82b3d1dda53 Mar 07 21:21:49.543946 master-0 kubenswrapper[16352]: I0307 21:21:49.543713 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" event={"ID":"f05b2327-b1ca-4b9b-a167-68f9fcb506e6","Type":"ContainerStarted","Data":"f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998"} Mar 07 21:21:49.543946 master-0 kubenswrapper[16352]: I0307 21:21:49.543834 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" event={"ID":"f05b2327-b1ca-4b9b-a167-68f9fcb506e6","Type":"ContainerStarted","Data":"3423d5650e5761050cb216135ebdea581e304f1e4fda7893df4d396f15fc6692"} Mar 07 21:21:49.544335 master-0 kubenswrapper[16352]: I0307 21:21:49.544276 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:49.550041 master-0 kubenswrapper[16352]: I0307 21:21:49.549971 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"2aadcdd5-aa40-442c-9434-97f150dddf70","Type":"ContainerStarted","Data":"df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f"} Mar 07 21:21:49.550113 master-0 kubenswrapper[16352]: I0307 21:21:49.550044 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"2aadcdd5-aa40-442c-9434-97f150dddf70","Type":"ContainerStarted","Data":"2cd11fd645ccc00a72bc62da05ad7ccff04a454a3604982122d7a82b3d1dda53"} Mar 07 21:21:49.554529 master-0 
kubenswrapper[16352]: I0307 21:21:49.554476 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"2200306f-7816-4019-a6e1-5847ea5b51b1","Type":"ContainerStarted","Data":"ef97b357626aa1231949078ed219eedc8490c7ce007443d958150cfdc31df36f"} Mar 07 21:21:49.554590 master-0 kubenswrapper[16352]: I0307 21:21:49.554533 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"2200306f-7816-4019-a6e1-5847ea5b51b1","Type":"ContainerStarted","Data":"1b0ee44965e1b527225d1cfb842b0cc111aef93935c0770d430b0d28fc3b7411"} Mar 07 21:21:49.585120 master-0 kubenswrapper[16352]: I0307 21:21:49.585006 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" podStartSLOduration=1.584974721 podStartE2EDuration="1.584974721s" podCreationTimestamp="2026-03-07 21:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:21:49.568291319 +0000 UTC m=+232.638996418" watchObservedRunningTime="2026-03-07 21:21:49.584974721 +0000 UTC m=+232.655679810" Mar 07 21:21:49.587648 master-0 kubenswrapper[16352]: I0307 21:21:49.587558 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:21:49.628623 master-0 kubenswrapper[16352]: I0307 21:21:49.628496 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-3-master-0" podStartSLOduration=2.62845723 podStartE2EDuration="2.62845723s" podCreationTimestamp="2026-03-07 21:21:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:21:49.597053252 +0000 UTC m=+232.667758351" 
watchObservedRunningTime="2026-03-07 21:21:49.62845723 +0000 UTC m=+232.699162329" Mar 07 21:21:49.650134 master-0 kubenswrapper[16352]: I0307 21:21:49.649286 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" podStartSLOduration=1.649258212 podStartE2EDuration="1.649258212s" podCreationTimestamp="2026-03-07 21:21:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:21:49.624562486 +0000 UTC m=+232.695267575" watchObservedRunningTime="2026-03-07 21:21:49.649258212 +0000 UTC m=+232.719963271" Mar 07 21:21:50.139301 master-0 kubenswrapper[16352]: I0307 21:21:50.139190 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-2hhhs"] Mar 07 21:21:51.507552 master-0 kubenswrapper[16352]: I0307 21:21:51.507477 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-69ccf66766-q79sx"] Mar 07 21:21:51.508897 master-0 kubenswrapper[16352]: I0307 21:21:51.508865 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.513058 master-0 kubenswrapper[16352]: I0307 21:21:51.512999 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls" Mar 07 21:21:51.513903 master-0 kubenswrapper[16352]: I0307 21:21:51.513853 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs" Mar 07 21:21:51.514113 master-0 kubenswrapper[16352]: I0307 21:21:51.514066 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config" Mar 07 21:21:51.516524 master-0 kubenswrapper[16352]: I0307 21:21:51.516450 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 07 21:21:51.517226 master-0 kubenswrapper[16352]: I0307 21:21:51.517146 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-hmpjk" Mar 07 21:21:51.517852 master-0 kubenswrapper[16352]: I0307 21:21:51.517755 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 07 21:21:51.525927 master-0 kubenswrapper[16352]: I0307 21:21:51.525864 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69ccf66766-q79sx"] Mar 07 21:21:51.563164 master-0 kubenswrapper[16352]: I0307 21:21:51.527906 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 07 21:21:51.569703 master-0 kubenswrapper[16352]: I0307 21:21:51.569632 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerName="kube-multus-additional-cni-plugins" 
containerID="cri-o://f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" gracePeriod=30 Mar 07 21:21:51.610783 master-0 kubenswrapper[16352]: I0307 21:21:51.610700 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntncp\" (UniqueName: \"kubernetes.io/projected/835feeef-503f-4c9a-b5b0-bf99030ef0e0-kube-api-access-ntncp\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.611153 master-0 kubenswrapper[16352]: I0307 21:21:51.610797 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-federate-client-tls\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.611153 master-0 kubenswrapper[16352]: I0307 21:21:51.610844 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.611153 master-0 kubenswrapper[16352]: I0307 21:21:51.610970 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-telemeter-client-tls\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.611324 master-0 kubenswrapper[16352]: I0307 
21:21:51.611287 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-metrics-client-ca\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.611882 master-0 kubenswrapper[16352]: I0307 21:21:51.611638 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.612073 master-0 kubenswrapper[16352]: I0307 21:21:51.612009 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-secret-telemeter-client\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.612167 master-0 kubenswrapper[16352]: I0307 21:21:51.612142 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-serving-certs-ca-bundle\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.715142 master-0 kubenswrapper[16352]: I0307 21:21:51.715054 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-serving-certs-ca-bundle\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.715525 master-0 kubenswrapper[16352]: I0307 21:21:51.715432 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-federate-client-tls\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.715664 master-0 kubenswrapper[16352]: I0307 21:21:51.715573 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntncp\" (UniqueName: \"kubernetes.io/projected/835feeef-503f-4c9a-b5b0-bf99030ef0e0-kube-api-access-ntncp\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.715901 master-0 kubenswrapper[16352]: I0307 21:21:51.715833 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.716112 master-0 kubenswrapper[16352]: I0307 21:21:51.716072 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-telemeter-client-tls\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " 
pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.716340 master-0 kubenswrapper[16352]: I0307 21:21:51.716282 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-serving-certs-ca-bundle\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.716375 master-0 kubenswrapper[16352]: I0307 21:21:51.716342 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-metrics-client-ca\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.716473 master-0 kubenswrapper[16352]: I0307 21:21:51.716444 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.716516 master-0 kubenswrapper[16352]: I0307 21:21:51.716503 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-secret-telemeter-client\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.717090 master-0 kubenswrapper[16352]: I0307 21:21:51.717049 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-metrics-client-ca\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.718177 master-0 kubenswrapper[16352]: I0307 21:21:51.718118 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/835feeef-503f-4c9a-b5b0-bf99030ef0e0-telemeter-trusted-ca-bundle\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.720735 master-0 kubenswrapper[16352]: I0307 21:21:51.720675 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.720802 master-0 kubenswrapper[16352]: I0307 21:21:51.720675 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-secret-telemeter-client\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.721740 master-0 kubenswrapper[16352]: I0307 21:21:51.721674 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-federate-client-tls\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " 
pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.726346 master-0 kubenswrapper[16352]: I0307 21:21:51.726285 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/835feeef-503f-4c9a-b5b0-bf99030ef0e0-telemeter-client-tls\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.740038 master-0 kubenswrapper[16352]: I0307 21:21:51.739949 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntncp\" (UniqueName: \"kubernetes.io/projected/835feeef-503f-4c9a-b5b0-bf99030ef0e0-kube-api-access-ntncp\") pod \"telemeter-client-69ccf66766-q79sx\" (UID: \"835feeef-503f-4c9a-b5b0-bf99030ef0e0\") " pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:51.871921 master-0 kubenswrapper[16352]: I0307 21:21:51.871788 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" Mar 07 21:21:52.407587 master-0 kubenswrapper[16352]: I0307 21:21:52.407409 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-69ccf66766-q79sx"] Mar 07 21:21:52.414783 master-0 kubenswrapper[16352]: W0307 21:21:52.414721 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod835feeef_503f_4c9a_b5b0_bf99030ef0e0.slice/crio-cb2850b74a5a25b9fd365d4a0546f17a6a1f43f53ed68373fd02fc4d0ea67c29 WatchSource:0}: Error finding container cb2850b74a5a25b9fd365d4a0546f17a6a1f43f53ed68373fd02fc4d0ea67c29: Status 404 returned error can't find the container with id cb2850b74a5a25b9fd365d4a0546f17a6a1f43f53ed68373fd02fc4d0ea67c29 Mar 07 21:21:52.582043 master-0 kubenswrapper[16352]: I0307 21:21:52.581910 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" event={"ID":"835feeef-503f-4c9a-b5b0-bf99030ef0e0","Type":"ContainerStarted","Data":"cb2850b74a5a25b9fd365d4a0546f17a6a1f43f53ed68373fd02fc4d0ea67c29"} Mar 07 21:21:55.628767 master-0 kubenswrapper[16352]: I0307 21:21:55.628562 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" event={"ID":"835feeef-503f-4c9a-b5b0-bf99030ef0e0","Type":"ContainerStarted","Data":"8db772292a291c958dd2372e88dde95641edd3ff450af81d7d1c3e6af2fd5c13"} Mar 07 21:21:56.643740 master-0 kubenswrapper[16352]: I0307 21:21:56.643598 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" event={"ID":"835feeef-503f-4c9a-b5b0-bf99030ef0e0","Type":"ContainerStarted","Data":"98202efb015ce7ba68f89637ed132d1513ed076929588cfb40a9970a50012ee2"} Mar 07 21:21:56.644307 master-0 kubenswrapper[16352]: I0307 21:21:56.643761 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" event={"ID":"835feeef-503f-4c9a-b5b0-bf99030ef0e0","Type":"ContainerStarted","Data":"bdeb35e2dee51efb3ba0dc39bc4466c25d2d81a14415442a86a31f09ac5355e6"} Mar 07 21:21:56.688117 master-0 kubenswrapper[16352]: I0307 21:21:56.687923 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-69ccf66766-q79sx" podStartSLOduration=2.048201631 podStartE2EDuration="5.687467238s" podCreationTimestamp="2026-03-07 21:21:51 +0000 UTC" firstStartedPulling="2026-03-07 21:21:52.41875581 +0000 UTC m=+235.489460899" lastFinishedPulling="2026-03-07 21:21:56.058021447 +0000 UTC m=+239.128726506" observedRunningTime="2026-03-07 21:21:56.680232994 +0000 UTC m=+239.750938113" watchObservedRunningTime="2026-03-07 21:21:56.687467238 +0000 UTC m=+239.758172337" Mar 07 21:21:57.319784 master-0 kubenswrapper[16352]: I0307 21:21:57.317930 16352 scope.go:117] "RemoveContainer" containerID="f11dea03780316a0cd94d2e932a489c49a45b9ec1636336c36582f2f1729ff4b" Mar 07 21:21:57.334773 master-0 kubenswrapper[16352]: I0307 21:21:57.334653 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8"] Mar 07 21:21:57.336581 master-0 kubenswrapper[16352]: I0307 21:21:57.336526 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:57.342115 master-0 kubenswrapper[16352]: I0307 21:21:57.341957 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-hqsqr" Mar 07 21:21:57.351963 master-0 kubenswrapper[16352]: I0307 21:21:57.351898 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8"] Mar 07 21:21:57.450849 master-0 kubenswrapper[16352]: I0307 21:21:57.450302 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrw5d\" (UniqueName: \"kubernetes.io/projected/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-kube-api-access-lrw5d\") pod \"multus-admission-controller-56bbfd46b8-6qcf8\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:57.450849 master-0 kubenswrapper[16352]: I0307 21:21:57.450536 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-6qcf8\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:57.552762 master-0 kubenswrapper[16352]: I0307 21:21:57.552697 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrw5d\" (UniqueName: \"kubernetes.io/projected/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-kube-api-access-lrw5d\") pod \"multus-admission-controller-56bbfd46b8-6qcf8\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:57.553054 master-0 kubenswrapper[16352]: I0307 21:21:57.552776 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-6qcf8\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:57.557547 master-0 kubenswrapper[16352]: I0307 21:21:57.557481 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-webhook-certs\") pod \"multus-admission-controller-56bbfd46b8-6qcf8\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:57.572241 master-0 kubenswrapper[16352]: I0307 21:21:57.572081 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrw5d\" (UniqueName: \"kubernetes.io/projected/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-kube-api-access-lrw5d\") pod \"multus-admission-controller-56bbfd46b8-6qcf8\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:57.683898 master-0 kubenswrapper[16352]: I0307 21:21:57.683811 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:21:58.175676 master-0 kubenswrapper[16352]: I0307 21:21:58.175530 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8"] Mar 07 21:21:58.508810 master-0 kubenswrapper[16352]: E0307 21:21:58.508723 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 07 21:21:58.510587 master-0 kubenswrapper[16352]: E0307 21:21:58.510357 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 07 21:21:58.512250 master-0 kubenswrapper[16352]: E0307 21:21:58.512182 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 07 21:21:58.512312 master-0 kubenswrapper[16352]: E0307 21:21:58.512261 16352 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerName="kube-multus-additional-cni-plugins" Mar 07 21:21:58.672046 master-0 kubenswrapper[16352]: I0307 21:21:58.671949 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" event={"ID":"ae7ca2b4-ab3c-44f5-b211-f68cd165349d","Type":"ContainerStarted","Data":"932e77a95a266f3a49729a833c9467a215cc08ba8594088722b1b8a34b918e54"} Mar 07 21:21:58.672046 master-0 kubenswrapper[16352]: I0307 21:21:58.672027 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" event={"ID":"ae7ca2b4-ab3c-44f5-b211-f68cd165349d","Type":"ContainerStarted","Data":"fc13c84fddc8b39e3ae583ab46f78d799260ed9607b8c779835700e3973fc081"} Mar 07 21:21:59.686385 master-0 kubenswrapper[16352]: I0307 21:21:59.686291 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" event={"ID":"ae7ca2b4-ab3c-44f5-b211-f68cd165349d","Type":"ContainerStarted","Data":"95a50f03c73d26087cb603eac561fb93e94820fd631d9ebde8bc2aec42f081ec"} Mar 07 21:21:59.716181 master-0 kubenswrapper[16352]: I0307 21:21:59.716053 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" podStartSLOduration=2.716011225 podStartE2EDuration="2.716011225s" podCreationTimestamp="2026-03-07 21:21:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:21:59.708260088 +0000 UTC m=+242.778965187" watchObservedRunningTime="2026-03-07 21:21:59.716011225 +0000 UTC m=+242.786716374" Mar 07 21:21:59.760639 master-0 kubenswrapper[16352]: I0307 21:21:59.760561 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-mmqbs"] Mar 07 21:21:59.761156 master-0 kubenswrapper[16352]: I0307 21:21:59.761093 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" 
podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="multus-admission-controller" containerID="cri-o://d0c8f910f29b908238dbc63bf9ac7b0f87a9546eaf7538fe52110d4fc58afa92" gracePeriod=30 Mar 07 21:21:59.761156 master-0 kubenswrapper[16352]: I0307 21:21:59.761134 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="kube-rbac-proxy" containerID="cri-o://3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6" gracePeriod=30 Mar 07 21:22:00.699711 master-0 kubenswrapper[16352]: I0307 21:22:00.699569 16352 generic.go:334] "Generic (PLEG): container finished" podID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerID="3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6" exitCode=0 Mar 07 21:22:00.700590 master-0 kubenswrapper[16352]: I0307 21:22:00.700458 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" event={"ID":"982319eb-2dc2-4faa-85d8-ee11840179fd","Type":"ContainerDied","Data":"3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6"} Mar 07 21:22:05.257306 master-0 kubenswrapper[16352]: I0307 21:22:05.257229 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"] Mar 07 21:22:05.258500 master-0 kubenswrapper[16352]: I0307 21:22:05.258462 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.261719 master-0 kubenswrapper[16352]: I0307 21:22:05.261624 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 21:22:05.262087 master-0 kubenswrapper[16352]: I0307 21:22:05.262039 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 21:22:05.262267 master-0 kubenswrapper[16352]: I0307 21:22:05.262189 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 21:22:05.262673 master-0 kubenswrapper[16352]: I0307 21:22:05.262594 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 21:22:05.267371 master-0 kubenswrapper[16352]: I0307 21:22:05.267300 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 21:22:05.267666 master-0 kubenswrapper[16352]: I0307 21:22:05.267582 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 21:22:05.267917 master-0 kubenswrapper[16352]: I0307 21:22:05.267878 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 21:22:05.268042 master-0 kubenswrapper[16352]: I0307 21:22:05.267991 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 21:22:05.268147 master-0 kubenswrapper[16352]: I0307 21:22:05.268039 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 21:22:05.268981 master-0 kubenswrapper[16352]: I0307 21:22:05.268869 16352 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 21:22:05.269956 master-0 kubenswrapper[16352]: I0307 21:22:05.269901 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 21:22:05.271605 master-0 kubenswrapper[16352]: I0307 21:22:05.271549 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-x8pfn" Mar 07 21:22:05.298620 master-0 kubenswrapper[16352]: I0307 21:22:05.298543 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 21:22:05.299209 master-0 kubenswrapper[16352]: I0307 21:22:05.299143 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 21:22:05.301052 master-0 kubenswrapper[16352]: I0307 21:22:05.300970 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"] Mar 07 21:22:05.338521 master-0 kubenswrapper[16352]: I0307 21:22:05.338429 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.338798 master-0 kubenswrapper[16352]: I0307 21:22:05.338542 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: 
\"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.338798 master-0 kubenswrapper[16352]: I0307 21:22:05.338606 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-audit-policies\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.338798 master-0 kubenswrapper[16352]: I0307 21:22:05.338649 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.338997 master-0 kubenswrapper[16352]: I0307 21:22:05.338951 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339050 master-0 kubenswrapper[16352]: I0307 21:22:05.339008 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " 
pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339050 master-0 kubenswrapper[16352]: I0307 21:22:05.339044 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/565ef599-985b-4308-8393-1c40e3f37868-audit-dir\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339136 master-0 kubenswrapper[16352]: I0307 21:22:05.339104 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339439 master-0 kubenswrapper[16352]: I0307 21:22:05.339384 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339497 master-0 kubenswrapper[16352]: I0307 21:22:05.339446 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cvrv\" (UniqueName: \"kubernetes.io/projected/565ef599-985b-4308-8393-1c40e3f37868-kube-api-access-7cvrv\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339563 master-0 kubenswrapper[16352]: I0307 21:22:05.339537 16352 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339612 master-0 kubenswrapper[16352]: I0307 21:22:05.339565 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.339612 master-0 kubenswrapper[16352]: I0307 21:22:05.339605 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.441764 master-0 kubenswrapper[16352]: I0307 21:22:05.441634 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7cvrv\" (UniqueName: \"kubernetes.io/projected/565ef599-985b-4308-8393-1c40e3f37868-kube-api-access-7cvrv\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.442037 master-0 kubenswrapper[16352]: I0307 21:22:05.441899 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.442037 master-0 kubenswrapper[16352]: I0307 21:22:05.441949 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.442037 master-0 kubenswrapper[16352]: I0307 21:22:05.441979 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.442349 master-0 kubenswrapper[16352]: E0307 21:22:05.442112 16352 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found Mar 07 21:22:05.442349 master-0 kubenswrapper[16352]: E0307 21:22:05.442192 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session podName:565ef599-985b-4308-8393-1c40e3f37868 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:05.942168407 +0000 UTC m=+249.012873476 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session") pod "oauth-openshift-6c8ccbd44d-m8w7j" (UID: "565ef599-985b-4308-8393-1c40e3f37868") : secret "v4-0-config-system-session" not found Mar 07 21:22:05.442559 master-0 kubenswrapper[16352]: I0307 21:22:05.442322 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:05.442559 master-0 kubenswrapper[16352]: E0307 21:22:05.442431 16352 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found Mar 07 21:22:05.442559 master-0 kubenswrapper[16352]: E0307 21:22:05.442527 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig podName:565ef599-985b-4308-8393-1c40e3f37868 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:05.942504644 +0000 UTC m=+249.013209693 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig") pod "oauth-openshift-6c8ccbd44d-m8w7j" (UID: "565ef599-985b-4308-8393-1c40e3f37868") : configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:05.443004 master-0 kubenswrapper[16352]: I0307 21:22:05.442557 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443004 master-0 kubenswrapper[16352]: I0307 21:22:05.442719 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443004 master-0 kubenswrapper[16352]: I0307 21:22:05.442802 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-audit-policies\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443004 master-0 kubenswrapper[16352]: I0307 21:22:05.442878 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443418 master-0 kubenswrapper[16352]: I0307 21:22:05.443237 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443418 master-0 kubenswrapper[16352]: I0307 21:22:05.443365 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443622 master-0 kubenswrapper[16352]: I0307 21:22:05.443577 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/565ef599-985b-4308-8393-1c40e3f37868-audit-dir\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443797 master-0 kubenswrapper[16352]: I0307 21:22:05.443757 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.443908 master-0 kubenswrapper[16352]: I0307 21:22:05.443773 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/565ef599-985b-4308-8393-1c40e3f37868-audit-dir\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.444315 master-0 kubenswrapper[16352]: I0307 21:22:05.444157 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-service-ca\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.445562 master-0 kubenswrapper[16352]: I0307 21:22:05.445487 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-audit-policies\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.445982 master-0 kubenswrapper[16352]: I0307 21:22:05.445912 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.447396 master-0 kubenswrapper[16352]: I0307 21:22:05.447212 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-error\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.447396 master-0 kubenswrapper[16352]: I0307 21:22:05.447246 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.450914 master-0 kubenswrapper[16352]: I0307 21:22:05.450819 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-router-certs\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.451630 master-0 kubenswrapper[16352]: I0307 21:22:05.451577 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-login\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.451993 master-0 kubenswrapper[16352]: I0307 21:22:05.451921 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.452095 master-0 kubenswrapper[16352]: I0307 21:22:05.452005 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.470517 master-0 kubenswrapper[16352]: I0307 21:22:05.470436 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cvrv\" (UniqueName: \"kubernetes.io/projected/565ef599-985b-4308-8393-1c40e3f37868-kube-api-access-7cvrv\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.954427 master-0 kubenswrapper[16352]: I0307 21:22:05.954269 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.954843 master-0 kubenswrapper[16352]: E0307 21:22:05.954644 16352 secret.go:189] Couldn't get secret openshift-authentication/v4-0-config-system-session: secret "v4-0-config-system-session" not found
Mar 07 21:22:05.954973 master-0 kubenswrapper[16352]: I0307 21:22:05.954644 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:05.954973 master-0 kubenswrapper[16352]: E0307 21:22:05.954945 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session podName:565ef599-985b-4308-8393-1c40e3f37868 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:06.954897243 +0000 UTC m=+250.025602342 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "v4-0-config-system-session" (UniqueName: "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session") pod "oauth-openshift-6c8ccbd44d-m8w7j" (UID: "565ef599-985b-4308-8393-1c40e3f37868") : secret "v4-0-config-system-session" not found
Mar 07 21:22:05.954973 master-0 kubenswrapper[16352]: E0307 21:22:05.954758 16352 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:05.955236 master-0 kubenswrapper[16352]: E0307 21:22:05.955079 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig podName:565ef599-985b-4308-8393-1c40e3f37868 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:06.955041106 +0000 UTC m=+250.025746265 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig") pod "oauth-openshift-6c8ccbd44d-m8w7j" (UID: "565ef599-985b-4308-8393-1c40e3f37868") : configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:06.975271 master-0 kubenswrapper[16352]: I0307 21:22:06.975160 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:06.975271 master-0 kubenswrapper[16352]: I0307 21:22:06.975259 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:06.976155 master-0 kubenswrapper[16352]: E0307 21:22:06.975510 16352 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:06.976155 master-0 kubenswrapper[16352]: E0307 21:22:06.975720 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig podName:565ef599-985b-4308-8393-1c40e3f37868 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:08.975658463 +0000 UTC m=+252.046363552 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig") pod "oauth-openshift-6c8ccbd44d-m8w7j" (UID: "565ef599-985b-4308-8393-1c40e3f37868") : configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:06.981201 master-0 kubenswrapper[16352]: I0307 21:22:06.981123 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:08.508051 master-0 kubenswrapper[16352]: E0307 21:22:08.507913 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 07 21:22:08.510127 master-0 kubenswrapper[16352]: E0307 21:22:08.510050 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 07 21:22:08.511343 master-0 kubenswrapper[16352]: E0307 21:22:08.511298 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"]
Mar 07 21:22:08.511343 master-0 kubenswrapper[16352]: E0307 21:22:08.511332 16352 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerName="kube-multus-additional-cni-plugins"
Mar 07 21:22:09.015526 master-0 kubenswrapper[16352]: I0307 21:22:09.015393 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:09.015970 master-0 kubenswrapper[16352]: E0307 21:22:09.015670 16352 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:09.015970 master-0 kubenswrapper[16352]: E0307 21:22:09.015835 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig podName:565ef599-985b-4308-8393-1c40e3f37868 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:13.015804112 +0000 UTC m=+256.086509211 (durationBeforeRetry 4s).
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig") pod "oauth-openshift-6c8ccbd44d-m8w7j" (UID: "565ef599-985b-4308-8393-1c40e3f37868") : configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:09.978140 master-0 kubenswrapper[16352]: I0307 21:22:09.978024 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 07 21:22:09.980323 master-0 kubenswrapper[16352]: I0307 21:22:09.980255 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:09.984443 master-0 kubenswrapper[16352]: I0307 21:22:09.984390 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-v4v2q"
Mar 07 21:22:09.984620 master-0 kubenswrapper[16352]: I0307 21:22:09.984489 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 07 21:22:09.984620 master-0 kubenswrapper[16352]: I0307 21:22:09.984495 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 07 21:22:10.137013 master-0 kubenswrapper[16352]: I0307 21:22:10.136887 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-var-lock\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.137470 master-0 kubenswrapper[16352]: I0307 21:22:10.137053 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c311d423-1179-467a-a50c-3e38e5d6e5ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.137470 master-0 kubenswrapper[16352]: I0307 21:22:10.137373 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.239168 master-0 kubenswrapper[16352]: I0307 21:22:10.238954 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-var-lock\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.239504 master-0 kubenswrapper[16352]: I0307 21:22:10.239205 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c311d423-1179-467a-a50c-3e38e5d6e5ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.239504 master-0 kubenswrapper[16352]: I0307 21:22:10.239193 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-var-lock\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.240285 master-0 kubenswrapper[16352]: I0307 21:22:10.240226 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.240442 master-0 kubenswrapper[16352]: I0307 21:22:10.240352 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.274847 master-0 kubenswrapper[16352]: I0307 21:22:10.274751 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c311d423-1179-467a-a50c-3e38e5d6e5ed-kube-api-access\") pod \"installer-2-master-0\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.329115 master-0 kubenswrapper[16352]: I0307 21:22:10.329003 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0"
Mar 07 21:22:10.871088 master-0 kubenswrapper[16352]: I0307 21:22:10.871005 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"]
Mar 07 21:22:11.810563 master-0 kubenswrapper[16352]: I0307 21:22:11.810462 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c311d423-1179-467a-a50c-3e38e5d6e5ed","Type":"ContainerStarted","Data":"0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59"}
Mar 07 21:22:11.810563 master-0 kubenswrapper[16352]: I0307 21:22:11.810543 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c311d423-1179-467a-a50c-3e38e5d6e5ed","Type":"ContainerStarted","Data":"16b6a716cff70f743c159337f640fae48c79962073b4c024c5cc09d587c4526e"}
Mar 07 21:22:11.840760 master-0 kubenswrapper[16352]: I0307 21:22:11.840624 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-2-master-0" podStartSLOduration=2.840591253 podStartE2EDuration="2.840591253s" podCreationTimestamp="2026-03-07 21:22:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:22:11.839373524 +0000 UTC m=+254.910078643" watchObservedRunningTime="2026-03-07 21:22:11.840591253 +0000 UTC m=+254.911296352"
Mar 07 21:22:13.108712 master-0 kubenswrapper[16352]: I0307 21:22:13.108579 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6c8ccbd44d-m8w7j\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:13.109466 master-0 kubenswrapper[16352]: E0307 21:22:13.109033 16352 configmap.go:193] Couldn't get configMap openshift-authentication/v4-0-config-system-cliconfig: configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:13.109466 master-0 kubenswrapper[16352]: E0307 21:22:13.109284 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig podName:565ef599-985b-4308-8393-1c40e3f37868 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:21.109227282 +0000 UTC m=+264.179932511 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "v4-0-config-system-cliconfig" (UniqueName: "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig") pod "oauth-openshift-6c8ccbd44d-m8w7j" (UID: "565ef599-985b-4308-8393-1c40e3f37868") : configmap "v4-0-config-system-cliconfig" not found
Mar 07 21:22:13.519127 master-0 kubenswrapper[16352]: I0307 21:22:13.518904 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:22:13.524653 master-0 kubenswrapper[16352]: I0307 21:22:13.524580 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"installer-1-retry-1-master-0\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") " pod="openshift-kube-apiserver/installer-1-retry-1-master-0"
Mar 07 21:22:13.608585 master-0 kubenswrapper[16352]: E0307 21:22:13.608430 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[trusted-ca], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" podUID="28fd6ebb-51ac-4763-99b2-3a94b124d059"
Mar 07 21:22:13.722415 master-0 kubenswrapper[16352]: I0307 21:22:13.722321 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") pod \"2357c135-5d09-4657-9038-48d25ed55b2d\" (UID: \"2357c135-5d09-4657-9038-48d25ed55b2d\") "
Mar 07 21:22:13.727372 master-0 kubenswrapper[16352]: I0307 21:22:13.727117 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2357c135-5d09-4657-9038-48d25ed55b2d" (UID: "2357c135-5d09-4657-9038-48d25ed55b2d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:22:13.825242 master-0 kubenswrapper[16352]: I0307 21:22:13.825057 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2357c135-5d09-4657-9038-48d25ed55b2d-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 07 21:22:13.845048 master-0 kubenswrapper[16352]: I0307 21:22:13.844960 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf"
Mar 07 21:22:14.536765 master-0 kubenswrapper[16352]: I0307 21:22:14.536590 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"]
Mar 07 21:22:14.538572 master-0 kubenswrapper[16352]: I0307 21:22:14.538503 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:14.541136 master-0 kubenswrapper[16352]: I0307 21:22:14.541053 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c92ce0ee-5758-4a49-a811-d49ab5309a38-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2lmd2\" (UID: \"c92ce0ee-5758-4a49-a811-d49ab5309a38\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:14.541136 master-0 kubenswrapper[16352]: I0307 21:22:14.541132 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c92ce0ee-5758-4a49-a811-d49ab5309a38-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-2lmd2\" (UID: \"c92ce0ee-5758-4a49-a811-d49ab5309a38\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:14.541398 master-0 kubenswrapper[16352]: I0307 21:22:14.541304 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert"
Mar 07 21:22:14.543657 master-0 kubenswrapper[16352]: I0307 21:22:14.543612 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 07 21:22:14.544073 master-0 kubenswrapper[16352]: I0307 21:22:14.544037 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-tpl92"
Mar 07 21:22:14.554347 master-0 kubenswrapper[16352]: I0307 21:22:14.554251 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"]
Mar 07 21:22:14.644055 master-0 kubenswrapper[16352]: I0307 21:22:14.643303 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c92ce0ee-5758-4a49-a811-d49ab5309a38-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-2lmd2\" (UID: \"c92ce0ee-5758-4a49-a811-d49ab5309a38\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:14.644055 master-0 kubenswrapper[16352]: I0307 21:22:14.643535 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c92ce0ee-5758-4a49-a811-d49ab5309a38-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2lmd2\" (UID: \"c92ce0ee-5758-4a49-a811-d49ab5309a38\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:14.644055 master-0 kubenswrapper[16352]: E0307 21:22:14.643716 16352 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Mar 07 21:22:14.644055 master-0 kubenswrapper[16352]: E0307 21:22:14.643783 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c92ce0ee-5758-4a49-a811-d49ab5309a38-networking-console-plugin-cert podName:c92ce0ee-5758-4a49-a811-d49ab5309a38 nodeName:}" failed. No retries permitted until 2026-03-07 21:22:15.143764855 +0000 UTC m=+258.214469934 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/c92ce0ee-5758-4a49-a811-d49ab5309a38-networking-console-plugin-cert") pod "networking-console-plugin-5cbd49d755-2lmd2" (UID: "c92ce0ee-5758-4a49-a811-d49ab5309a38") : secret "networking-console-plugin-cert" not found
Mar 07 21:22:14.645191 master-0 kubenswrapper[16352]: I0307 21:22:14.645122 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/c92ce0ee-5758-4a49-a811-d49ab5309a38-nginx-conf\") pod \"networking-console-plugin-5cbd49d755-2lmd2\" (UID: \"c92ce0ee-5758-4a49-a811-d49ab5309a38\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:15.155654 master-0 kubenswrapper[16352]: I0307 21:22:15.155527 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c92ce0ee-5758-4a49-a811-d49ab5309a38-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2lmd2\" (UID: \"c92ce0ee-5758-4a49-a811-d49ab5309a38\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:15.162812 master-0 kubenswrapper[16352]: I0307 21:22:15.162728 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/c92ce0ee-5758-4a49-a811-d49ab5309a38-networking-console-plugin-cert\") pod \"networking-console-plugin-5cbd49d755-2lmd2\" (UID: \"c92ce0ee-5758-4a49-a811-d49ab5309a38\") " pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:15.172821 master-0 kubenswrapper[16352]: I0307 21:22:15.172734 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"
Mar 07 21:22:15.673407 master-0 kubenswrapper[16352]: I0307 21:22:15.673301 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2"]
Mar 07 21:22:15.873954 master-0 kubenswrapper[16352]: I0307 21:22:15.873840 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2" event={"ID":"c92ce0ee-5758-4a49-a811-d49ab5309a38","Type":"ContainerStarted","Data":"0f7343e79c5065937f87aae952b833de3ff8eb8a248d6dc026b9e4d53ef9f655"}
Mar 07 21:22:16.450329 master-0 kubenswrapper[16352]: I0307 21:22:16.450235 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"]
Mar 07 21:22:16.450934 master-0 kubenswrapper[16352]: E0307 21:22:16.450889 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[v4-0-config-system-cliconfig], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" podUID="565ef599-985b-4308-8393-1c40e3f37868"
Mar 07 21:22:16.883326 master-0 kubenswrapper[16352]: I0307 21:22:16.883219 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:16.898402 master-0 kubenswrapper[16352]: I0307 21:22:16.898336 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"
Mar 07 21:22:17.099640 master-0 kubenswrapper[16352]: I0307 21:22:17.099572 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-serving-cert\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.099762 master-0 kubenswrapper[16352]: I0307 21:22:17.099742 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7cvrv\" (UniqueName: \"kubernetes.io/projected/565ef599-985b-4308-8393-1c40e3f37868-kube-api-access-7cvrv\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.099832 master-0 kubenswrapper[16352]: I0307 21:22:17.099812 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.099913 master-0 kubenswrapper[16352]: I0307 21:22:17.099870 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-audit-policies\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.099968 master-0 kubenswrapper[16352]: I0307 21:22:17.099939 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/565ef599-985b-4308-8393-1c40e3f37868-audit-dir\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.100015 master-0 kubenswrapper[16352]: I0307 21:22:17.099972 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-router-certs\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.100064 master-0 kubenswrapper[16352]: I0307 21:22:17.100028 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-error\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.100526 master-0 kubenswrapper[16352]: I0307 21:22:17.100221 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/565ef599-985b-4308-8393-1c40e3f37868-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:22:17.100526 master-0 kubenswrapper[16352]: I0307 21:22:17.100368 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-ocp-branding-template\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.100526 master-0 kubenswrapper[16352]: I0307 21:22:17.100420 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-service-ca\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.101152 master-0 kubenswrapper[16352]: I0307 21:22:17.100593 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-login\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.101152 master-0 kubenswrapper[16352]: I0307 21:22:17.100641 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-trusted-ca-bundle\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") "
Mar 07 21:22:17.101152 master-0 kubenswrapper[16352]: I0307 21:22:17.100727 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName:
\"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-provider-selection\") pod \"565ef599-985b-4308-8393-1c40e3f37868\" (UID: \"565ef599-985b-4308-8393-1c40e3f37868\") " Mar 07 21:22:17.101152 master-0 kubenswrapper[16352]: I0307 21:22:17.101130 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:22:17.101591 master-0 kubenswrapper[16352]: I0307 21:22:17.101387 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:22:17.101948 master-0 kubenswrapper[16352]: I0307 21:22:17.101798 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.101948 master-0 kubenswrapper[16352]: I0307 21:22:17.101906 16352 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.101948 master-0 kubenswrapper[16352]: I0307 21:22:17.101939 16352 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/565ef599-985b-4308-8393-1c40e3f37868-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.102285 master-0 kubenswrapper[16352]: I0307 21:22:17.101957 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:22:17.103180 master-0 kubenswrapper[16352]: I0307 21:22:17.103113 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-system-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:17.103376 master-0 kubenswrapper[16352]: I0307 21:22:17.103333 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:17.104019 master-0 kubenswrapper[16352]: I0307 21:22:17.103973 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:17.105345 master-0 kubenswrapper[16352]: I0307 21:22:17.105265 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:17.105498 master-0 kubenswrapper[16352]: I0307 21:22:17.105475 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:17.105929 master-0 kubenswrapper[16352]: I0307 21:22:17.105857 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/565ef599-985b-4308-8393-1c40e3f37868-kube-api-access-7cvrv" (OuterVolumeSpecName: "kube-api-access-7cvrv") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "kube-api-access-7cvrv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:22:17.107843 master-0 kubenswrapper[16352]: I0307 21:22:17.107770 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:17.110051 master-0 kubenswrapper[16352]: I0307 21:22:17.109959 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "565ef599-985b-4308-8393-1c40e3f37868" (UID: "565ef599-985b-4308-8393-1c40e3f37868"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:17.207370 master-0 kubenswrapper[16352]: I0307 21:22:17.207191 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7cvrv\" (UniqueName: \"kubernetes.io/projected/565ef599-985b-4308-8393-1c40e3f37868-kube-api-access-7cvrv\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.207370 master-0 kubenswrapper[16352]: I0307 21:22:17.207375 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.207844 master-0 kubenswrapper[16352]: I0307 21:22:17.207434 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.207844 master-0 kubenswrapper[16352]: I0307 21:22:17.207490 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.207844 master-0 kubenswrapper[16352]: I0307 21:22:17.207545 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.207844 master-0 kubenswrapper[16352]: I0307 21:22:17.207616 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.207844 master-0 
kubenswrapper[16352]: I0307 21:22:17.207641 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.208147 master-0 kubenswrapper[16352]: I0307 21:22:17.207880 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.208784 master-0 kubenswrapper[16352]: I0307 21:22:17.208124 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:17.312288 master-0 kubenswrapper[16352]: I0307 21:22:17.312220 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" Mar 07 21:22:17.314098 master-0 kubenswrapper[16352]: I0307 21:22:17.314041 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/28fd6ebb-51ac-4763-99b2-3a94b124d059-trusted-ca\") pod \"console-operator-6c7fb6b958-2grlf\" (UID: \"28fd6ebb-51ac-4763-99b2-3a94b124d059\") " pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" Mar 07 21:22:17.446579 master-0 kubenswrapper[16352]: I0307 21:22:17.446367 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" Mar 07 21:22:17.893498 master-0 kubenswrapper[16352]: I0307 21:22:17.893395 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2" event={"ID":"c92ce0ee-5758-4a49-a811-d49ab5309a38","Type":"ContainerStarted","Data":"0b9d6e6de0e01af967853c8ceeb10c6b44f45a25ea49cc6360d9284bb50dd527"} Mar 07 21:22:17.894106 master-0 kubenswrapper[16352]: I0307 21:22:17.893449 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j" Mar 07 21:22:17.924519 master-0 kubenswrapper[16352]: I0307 21:22:17.924443 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-6c7fb6b958-2grlf"] Mar 07 21:22:17.925644 master-0 kubenswrapper[16352]: I0307 21:22:17.925532 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-5cbd49d755-2lmd2" podStartSLOduration=2.513626855 podStartE2EDuration="3.925513609s" podCreationTimestamp="2026-03-07 21:22:14 +0000 UTC" firstStartedPulling="2026-03-07 21:22:15.688949314 +0000 UTC m=+258.759654413" lastFinishedPulling="2026-03-07 21:22:17.100836118 +0000 UTC m=+260.171541167" observedRunningTime="2026-03-07 21:22:17.921475921 +0000 UTC m=+260.992181020" watchObservedRunningTime="2026-03-07 21:22:17.925513609 +0000 UTC m=+260.996218708" Mar 07 21:22:17.996509 master-0 kubenswrapper[16352]: I0307 21:22:17.996443 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-67c6dd6955-hbksv"] Mar 07 21:22:17.997972 master-0 kubenswrapper[16352]: I0307 21:22:17.997934 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.000145 master-0 kubenswrapper[16352]: I0307 21:22:18.000094 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"] Mar 07 21:22:18.001214 master-0 kubenswrapper[16352]: I0307 21:22:18.001150 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 21:22:18.002408 master-0 kubenswrapper[16352]: I0307 21:22:18.002346 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 21:22:18.002817 master-0 kubenswrapper[16352]: I0307 21:22:18.002773 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 21:22:18.003015 master-0 kubenswrapper[16352]: I0307 21:22:18.002984 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 21:22:18.004672 master-0 kubenswrapper[16352]: I0307 21:22:18.004620 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 21:22:18.005084 master-0 kubenswrapper[16352]: I0307 21:22:18.005054 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 21:22:18.005276 master-0 kubenswrapper[16352]: I0307 21:22:18.005248 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 21:22:18.005370 master-0 kubenswrapper[16352]: I0307 21:22:18.005304 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-x8pfn" Mar 07 21:22:18.005649 master-0 kubenswrapper[16352]: I0307 21:22:18.005608 16352 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 21:22:18.005857 master-0 kubenswrapper[16352]: I0307 21:22:18.005831 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 21:22:18.007052 master-0 kubenswrapper[16352]: I0307 21:22:18.007019 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 21:22:18.007617 master-0 kubenswrapper[16352]: I0307 21:22:18.007575 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 21:22:18.020125 master-0 kubenswrapper[16352]: I0307 21:22:18.015655 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 21:22:18.020125 master-0 kubenswrapper[16352]: I0307 21:22:18.016036 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-6c8ccbd44d-m8w7j"] Mar 07 21:22:18.020596 master-0 kubenswrapper[16352]: I0307 21:22:18.020547 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67c6dd6955-hbksv"] Mar 07 21:22:18.029161 master-0 kubenswrapper[16352]: I0307 21:22:18.029059 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-login\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029381 master-0 kubenswrapper[16352]: I0307 21:22:18.029171 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029381 master-0 kubenswrapper[16352]: I0307 21:22:18.029356 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029486 master-0 kubenswrapper[16352]: I0307 21:22:18.029387 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029535 master-0 kubenswrapper[16352]: I0307 21:22:18.029467 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-error\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029604 master-0 kubenswrapper[16352]: I0307 21:22:18.029539 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-dir\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029604 master-0 kubenswrapper[16352]: I0307 21:22:18.029573 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-session\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029748 master-0 kubenswrapper[16352]: I0307 21:22:18.029609 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029748 master-0 kubenswrapper[16352]: I0307 21:22:18.029640 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-policies\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029748 master-0 kubenswrapper[16352]: I0307 21:22:18.029722 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-router-certs\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: 
\"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029874 master-0 kubenswrapper[16352]: I0307 21:22:18.029767 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029874 master-0 kubenswrapper[16352]: I0307 21:22:18.029804 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-service-ca\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.029874 master-0 kubenswrapper[16352]: I0307 21:22:18.029841 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4lqg9\" (UniqueName: \"kubernetes.io/projected/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-kube-api-access-4lqg9\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.030006 master-0 kubenswrapper[16352]: I0307 21:22:18.029912 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/565ef599-985b-4308-8393-1c40e3f37868-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:18.051622 master-0 kubenswrapper[16352]: I0307 21:22:18.051522 16352 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 21:22:18.131604 master-0 kubenswrapper[16352]: I0307 21:22:18.131518 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.131604 master-0 kubenswrapper[16352]: I0307 21:22:18.131579 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131759 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-dir\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131792 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-session\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131814 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-error\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131837 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131856 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-policies\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131892 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-router-certs\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131920 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-serving-cert\") pod 
\"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131938 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-service-ca\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131965 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4lqg9\" (UniqueName: \"kubernetes.io/projected/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-kube-api-access-4lqg9\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.131993 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-login\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132044 master-0 kubenswrapper[16352]: I0307 21:22:18.132029 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 
21:22:18.132875 master-0 kubenswrapper[16352]: I0307 21:22:18.132721 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.132947 master-0 kubenswrapper[16352]: I0307 21:22:18.132905 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-dir\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.133338 master-0 kubenswrapper[16352]: I0307 21:22:18.133267 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-policies\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.133416 master-0 kubenswrapper[16352]: I0307 21:22:18.133306 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-service-ca\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.133605 master-0 kubenswrapper[16352]: I0307 21:22:18.133538 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-cliconfig\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.135176 master-0 kubenswrapper[16352]: I0307 21:22:18.135132 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.136656 master-0 kubenswrapper[16352]: I0307 21:22:18.136597 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-router-certs\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.136808 master-0 kubenswrapper[16352]: I0307 21:22:18.136614 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-session\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.136808 master-0 kubenswrapper[16352]: I0307 21:22:18.136804 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-error\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " 
pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.137893 master-0 kubenswrapper[16352]: I0307 21:22:18.137808 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.138016 master-0 kubenswrapper[16352]: I0307 21:22:18.137917 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-login\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.138016 master-0 kubenswrapper[16352]: I0307 21:22:18.137987 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-serving-cert\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.151298 master-0 kubenswrapper[16352]: I0307 21:22:18.151189 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4lqg9\" (UniqueName: \"kubernetes.io/projected/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-kube-api-access-4lqg9\") pod \"oauth-openshift-67c6dd6955-hbksv\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.337218 master-0 kubenswrapper[16352]: I0307 21:22:18.337118 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:18.510332 master-0 kubenswrapper[16352]: E0307 21:22:18.510255 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 07 21:22:18.514736 master-0 kubenswrapper[16352]: E0307 21:22:18.514176 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 07 21:22:18.521002 master-0 kubenswrapper[16352]: E0307 21:22:18.520917 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" cmd=["/bin/bash","-c","test -f /ready/ready"] Mar 07 21:22:18.521080 master-0 kubenswrapper[16352]: E0307 21:22:18.521056 16352 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerName="kube-multus-additional-cni-plugins" Mar 07 21:22:18.821500 master-0 kubenswrapper[16352]: I0307 21:22:18.821310 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-67c6dd6955-hbksv"] Mar 07 21:22:18.907538 master-0 kubenswrapper[16352]: I0307 21:22:18.907393 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" event={"ID":"e7e05fee-e85a-41b9-b3cb-64658e71bf1a","Type":"ContainerStarted","Data":"abea292beb2f62f6cb8c121335eac995a6de858aeef5197b723a613afbc0ad5e"} Mar 07 21:22:18.909385 master-0 kubenswrapper[16352]: I0307 21:22:18.909295 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" event={"ID":"28fd6ebb-51ac-4763-99b2-3a94b124d059","Type":"ContainerStarted","Data":"b59ad258083df94117f66e2f22a7fa0338415ef2b6011ae47e642e3f5673d11c"} Mar 07 21:22:19.202720 master-0 kubenswrapper[16352]: I0307 21:22:19.202613 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="565ef599-985b-4308-8393-1c40e3f37868" path="/var/lib/kubelet/pods/565ef599-985b-4308-8393-1c40e3f37868/volumes" Mar 07 21:22:20.748155 master-0 kubenswrapper[16352]: I0307 21:22:20.747945 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 07 21:22:20.749035 master-0 kubenswrapper[16352]: I0307 21:22:20.748244 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" containerID="cri-o://b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7" gracePeriod=30 Mar 07 21:22:20.749289 master-0 kubenswrapper[16352]: I0307 21:22:20.749145 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 07 21:22:20.749548 master-0 kubenswrapper[16352]: E0307 21:22:20.749498 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 07 21:22:20.749548 master-0 kubenswrapper[16352]: I0307 21:22:20.749524 16352 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 07 21:22:20.749548 master-0 kubenswrapper[16352]: E0307 21:22:20.749546 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 07 21:22:20.749548 master-0 kubenswrapper[16352]: I0307 21:22:20.749555 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 07 21:22:20.749851 master-0 kubenswrapper[16352]: I0307 21:22:20.749764 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 07 21:22:20.749851 master-0 kubenswrapper[16352]: I0307 21:22:20.749785 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1a56802af72ce1aac6b5077f1695ac0" containerName="kube-scheduler" Mar 07 21:22:20.751291 master-0 kubenswrapper[16352]: I0307 21:22:20.751231 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.779302 master-0 kubenswrapper[16352]: I0307 21:22:20.779218 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.779432 master-0 kubenswrapper[16352]: I0307 21:22:20.779402 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.889927 master-0 kubenswrapper[16352]: I0307 21:22:20.887270 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.889927 master-0 kubenswrapper[16352]: I0307 21:22:20.887404 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.889927 master-0 kubenswrapper[16352]: I0307 21:22:20.887532 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.889927 master-0 kubenswrapper[16352]: I0307 21:22:20.887589 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.909014 master-0 kubenswrapper[16352]: I0307 21:22:20.908241 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:20.929706 master-0 kubenswrapper[16352]: I0307 21:22:20.926298 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 07 21:22:20.963711 master-0 kubenswrapper[16352]: I0307 21:22:20.962898 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" event={"ID":"28fd6ebb-51ac-4763-99b2-3a94b124d059","Type":"ContainerStarted","Data":"3ff190bcde58938fa03f0cc8aa380bca173f31298a6ed25701f44e2da028c452"} Mar 07 21:22:20.963953 master-0 kubenswrapper[16352]: I0307 21:22:20.963928 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" Mar 07 21:22:20.997580 master-0 kubenswrapper[16352]: I0307 21:22:20.997466 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" podStartSLOduration=251.499755189 podStartE2EDuration="4m13.997445952s" podCreationTimestamp="2026-03-07 21:18:07 +0000 UTC" firstStartedPulling="2026-03-07 21:22:17.968846094 
+0000 UTC m=+261.039551153" lastFinishedPulling="2026-03-07 21:22:20.466536857 +0000 UTC m=+263.537241916" observedRunningTime="2026-03-07 21:22:20.993333793 +0000 UTC m=+264.064038852" watchObservedRunningTime="2026-03-07 21:22:20.997445952 +0000 UTC m=+264.068151011" Mar 07 21:22:21.245511 master-0 kubenswrapper[16352]: I0307 21:22:21.245436 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-6c7fb6b958-2grlf" Mar 07 21:22:21.702041 master-0 kubenswrapper[16352]: I0307 21:22:21.701920 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:22:21.707458 master-0 kubenswrapper[16352]: I0307 21:22:21.707428 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-2hhhs_f05b2327-b1ca-4b9b-a167-68f9fcb506e6/kube-multus-additional-cni-plugins/0.log" Mar 07 21:22:21.707458 master-0 kubenswrapper[16352]: I0307 21:22:21.707488 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:22:21.750452 master-0 kubenswrapper[16352]: E0307 21:22:21.750350 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05b2327_b1ca_4b9b_a167_68f9fcb506e6.slice/crio-f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05b2327_b1ca_4b9b_a167_68f9fcb506e6.slice/crio-conmon-f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-conmon-b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod2aadcdd5_aa40_442c_9434_97f150dddf70.slice/crio-df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod2aadcdd5_aa40_442c_9434_97f150dddf70.slice/crio-conmon-df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:22:21.750452 master-0 kubenswrapper[16352]: E0307 21:22:21.750411 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982319eb_2dc2_4faa_85d8_ee11840179fd.slice/crio-conmon-3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05b2327_b1ca_4b9b_a167_68f9fcb506e6.slice/crio-f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982319eb_2dc2_4faa_85d8_ee11840179fd.slice/crio-3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod2aadcdd5_aa40_442c_9434_97f150dddf70.slice/crio-df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod2aadcdd5_aa40_442c_9434_97f150dddf70.slice/crio-conmon-df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05b2327_b1ca_4b9b_a167_68f9fcb506e6.slice/crio-conmon-f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:22:21.751033 master-0 kubenswrapper[16352]: E0307 21:22:21.750891 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05b2327_b1ca_4b9b_a167_68f9fcb506e6.slice/crio-conmon-f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-conmon-b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1a56802af72ce1aac6b5077f1695ac0.slice/crio-b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982319eb_2dc2_4faa_85d8_ee11840179fd.slice/crio-3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod2aadcdd5_aa40_442c_9434_97f150dddf70.slice/crio-conmon-df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod2aadcdd5_aa40_442c_9434_97f150dddf70.slice/crio-df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod982319eb_2dc2_4faa_85d8_ee11840179fd.slice/crio-conmon-3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf05b2327_b1ca_4b9b_a167_68f9fcb506e6.slice/crio-f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:22:21.812657 master-0 kubenswrapper[16352]: I0307 21:22:21.812602 16352 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 07 21:22:21.812799 master-0 kubenswrapper[16352]: I0307 21:22:21.812738 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets" (OuterVolumeSpecName: "secrets") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:21.812799 master-0 kubenswrapper[16352]: I0307 21:22:21.812789 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") pod \"a1a56802af72ce1aac6b5077f1695ac0\" (UID: \"a1a56802af72ce1aac6b5077f1695ac0\") " Mar 07 21:22:21.812897 master-0 kubenswrapper[16352]: I0307 21:22:21.812864 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs" (OuterVolumeSpecName: "logs") pod "a1a56802af72ce1aac6b5077f1695ac0" (UID: "a1a56802af72ce1aac6b5077f1695ac0"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:21.812897 master-0 kubenswrapper[16352]: I0307 21:22:21.812891 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-ready\") pod \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " Mar 07 21:22:21.812996 master-0 kubenswrapper[16352]: I0307 21:22:21.812925 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8wrr2\" (UniqueName: \"kubernetes.io/projected/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-kube-api-access-8wrr2\") pod \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " Mar 07 21:22:21.813049 master-0 kubenswrapper[16352]: I0307 21:22:21.812997 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-cni-sysctl-allowlist\") pod \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " Mar 07 21:22:21.813049 master-0 kubenswrapper[16352]: I0307 21:22:21.813039 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-tuning-conf-dir\") pod \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\" (UID: \"f05b2327-b1ca-4b9b-a167-68f9fcb506e6\") " Mar 07 21:22:21.813392 master-0 kubenswrapper[16352]: I0307 21:22:21.813361 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-ready" (OuterVolumeSpecName: "ready") pod "f05b2327-b1ca-4b9b-a167-68f9fcb506e6" (UID: "f05b2327-b1ca-4b9b-a167-68f9fcb506e6"). InnerVolumeSpecName "ready". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:22:21.813547 master-0 kubenswrapper[16352]: I0307 21:22:21.813382 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-tuning-conf-dir" (OuterVolumeSpecName: "tuning-conf-dir") pod "f05b2327-b1ca-4b9b-a167-68f9fcb506e6" (UID: "f05b2327-b1ca-4b9b-a167-68f9fcb506e6"). InnerVolumeSpecName "tuning-conf-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:21.813769 master-0 kubenswrapper[16352]: I0307 21:22:21.813742 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "f05b2327-b1ca-4b9b-a167-68f9fcb506e6" (UID: "f05b2327-b1ca-4b9b-a167-68f9fcb506e6"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:22:21.814127 master-0 kubenswrapper[16352]: I0307 21:22:21.814104 16352 reconciler_common.go:293] "Volume detached for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-ready\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:21.814204 master-0 kubenswrapper[16352]: I0307 21:22:21.814191 16352 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-cni-sysctl-allowlist\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:21.814268 master-0 kubenswrapper[16352]: I0307 21:22:21.814257 16352 reconciler_common.go:293] "Volume detached for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-tuning-conf-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:21.814334 master-0 kubenswrapper[16352]: I0307 21:22:21.814324 16352 reconciler_common.go:293] "Volume detached for volume \"secrets\" 
(UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-secrets\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:21.814402 master-0 kubenswrapper[16352]: I0307 21:22:21.814392 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/a1a56802af72ce1aac6b5077f1695ac0-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:21.816959 master-0 kubenswrapper[16352]: I0307 21:22:21.816910 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-kube-api-access-8wrr2" (OuterVolumeSpecName: "kube-api-access-8wrr2") pod "f05b2327-b1ca-4b9b-a167-68f9fcb506e6" (UID: "f05b2327-b1ca-4b9b-a167-68f9fcb506e6"). InnerVolumeSpecName "kube-api-access-8wrr2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:22:21.916482 master-0 kubenswrapper[16352]: I0307 21:22:21.916421 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8wrr2\" (UniqueName: \"kubernetes.io/projected/f05b2327-b1ca-4b9b-a167-68f9fcb506e6-kube-api-access-8wrr2\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:21.973588 master-0 kubenswrapper[16352]: I0307 21:22:21.973520 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="801c0e6645c48e23de0745ca7de89bfebe070d2e4b76a9fdb72366ccb7b3154a" exitCode=0 Mar 07 21:22:21.974054 master-0 kubenswrapper[16352]: I0307 21:22:21.973657 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerDied","Data":"801c0e6645c48e23de0745ca7de89bfebe070d2e4b76a9fdb72366ccb7b3154a"} Mar 07 21:22:21.974223 master-0 kubenswrapper[16352]: I0307 21:22:21.974191 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"6ec220a16a4414ebbd87f4036017b07cef5c4b07da82f02aa544a2d6c79d687e"} Mar 07 21:22:21.976469 master-0 kubenswrapper[16352]: I0307 21:22:21.976414 16352 generic.go:334] "Generic (PLEG): container finished" podID="a1a56802af72ce1aac6b5077f1695ac0" containerID="b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7" exitCode=0 Mar 07 21:22:21.976739 master-0 kubenswrapper[16352]: I0307 21:22:21.976524 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-scheduler-master-0" Mar 07 21:22:21.976739 master-0 kubenswrapper[16352]: I0307 21:22:21.976546 16352 scope.go:117] "RemoveContainer" containerID="b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7" Mar 07 21:22:21.981838 master-0 kubenswrapper[16352]: I0307 21:22:21.981779 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" event={"ID":"e7e05fee-e85a-41b9-b3cb-64658e71bf1a","Type":"ContainerStarted","Data":"180b2e86f8e1962ba6e7528073f3f3f6b9cd411fcee0a25b278b1a7f919e78b5"} Mar 07 21:22:21.984436 master-0 kubenswrapper[16352]: I0307 21:22:21.984395 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:21.988612 master-0 kubenswrapper[16352]: I0307 21:22:21.988568 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_cni-sysctl-allowlist-ds-2hhhs_f05b2327-b1ca-4b9b-a167-68f9fcb506e6/kube-multus-additional-cni-plugins/0.log" Mar 07 21:22:21.988840 master-0 kubenswrapper[16352]: I0307 21:22:21.988638 16352 generic.go:334] "Generic (PLEG): container finished" podID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" exitCode=137 Mar 07 21:22:21.988840 master-0 kubenswrapper[16352]: I0307 21:22:21.988812 16352 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" Mar 07 21:22:21.989595 master-0 kubenswrapper[16352]: I0307 21:22:21.988891 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" event={"ID":"f05b2327-b1ca-4b9b-a167-68f9fcb506e6","Type":"ContainerDied","Data":"f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998"} Mar 07 21:22:21.989595 master-0 kubenswrapper[16352]: I0307 21:22:21.988925 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-2hhhs" event={"ID":"f05b2327-b1ca-4b9b-a167-68f9fcb506e6","Type":"ContainerDied","Data":"3423d5650e5761050cb216135ebdea581e304f1e4fda7893df4d396f15fc6692"} Mar 07 21:22:21.990907 master-0 kubenswrapper[16352]: I0307 21:22:21.990859 16352 generic.go:334] "Generic (PLEG): container finished" podID="2aadcdd5-aa40-442c-9434-97f150dddf70" containerID="df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f" exitCode=0 Mar 07 21:22:21.991198 master-0 kubenswrapper[16352]: I0307 21:22:21.991135 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"2aadcdd5-aa40-442c-9434-97f150dddf70","Type":"ContainerDied","Data":"df73146b2a51afa19a084a0e4b19b3be738b8e253711e91a384859fbb782b03f"} Mar 07 21:22:22.034478 master-0 kubenswrapper[16352]: I0307 21:22:22.034428 16352 scope.go:117] "RemoveContainer" containerID="fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da" Mar 07 21:22:22.060602 master-0 kubenswrapper[16352]: I0307 21:22:22.060501 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 07 21:22:22.061133 master-0 kubenswrapper[16352]: I0307 21:22:22.061038 16352 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" containerID="cri-o://280e10e4ead7199cb4e5eb06d68976c14126e54c3ec3e9d229c33b8faed6eeb7" gracePeriod=30 Mar 07 21:22:22.061198 master-0 kubenswrapper[16352]: I0307 21:22:22.061121 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" containerID="cri-o://554ffc5919fe7a46fc0ad2b26594bc2dec62e5f792ce74d74fe8d549af25bf01" gracePeriod=30 Mar 07 21:22:22.061672 master-0 kubenswrapper[16352]: I0307 21:22:22.061628 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 07 21:22:22.062298 master-0 kubenswrapper[16352]: E0307 21:22:22.062262 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 07 21:22:22.062298 master-0 kubenswrapper[16352]: I0307 21:22:22.062291 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 07 21:22:22.062627 master-0 kubenswrapper[16352]: E0307 21:22:22.062311 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 07 21:22:22.062627 master-0 kubenswrapper[16352]: I0307 21:22:22.062617 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 07 21:22:22.062741 master-0 kubenswrapper[16352]: E0307 21:22:22.062637 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerName="kube-multus-additional-cni-plugins" Mar 07 21:22:22.062741 master-0 
kubenswrapper[16352]: I0307 21:22:22.062648 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerName="kube-multus-additional-cni-plugins" Mar 07 21:22:22.062877 master-0 kubenswrapper[16352]: I0307 21:22:22.062842 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 07 21:22:22.062877 master-0 kubenswrapper[16352]: I0307 21:22:22.062861 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" containerName="kube-multus-additional-cni-plugins" Mar 07 21:22:22.062958 master-0 kubenswrapper[16352]: I0307 21:22:22.062882 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="cluster-policy-controller" Mar 07 21:22:22.062958 master-0 kubenswrapper[16352]: I0307 21:22:22.062908 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 07 21:22:22.063102 master-0 kubenswrapper[16352]: E0307 21:22:22.063070 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 07 21:22:22.063102 master-0 kubenswrapper[16352]: I0307 21:22:22.063087 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78c05e1499b533b83f091333d61f045" containerName="kube-controller-manager" Mar 07 21:22:22.065572 master-0 kubenswrapper[16352]: I0307 21:22:22.064761 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.091430 master-0 kubenswrapper[16352]: I0307 21:22:22.089941 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" podStartSLOduration=3.33107138 podStartE2EDuration="6.089912032s" podCreationTimestamp="2026-03-07 21:22:16 +0000 UTC" firstStartedPulling="2026-03-07 21:22:18.833191131 +0000 UTC m=+261.903896190" lastFinishedPulling="2026-03-07 21:22:21.592031773 +0000 UTC m=+264.662736842" observedRunningTime="2026-03-07 21:22:22.069089129 +0000 UTC m=+265.139794178" watchObservedRunningTime="2026-03-07 21:22:22.089912032 +0000 UTC m=+265.160617091" Mar 07 21:22:22.114342 master-0 kubenswrapper[16352]: I0307 21:22:22.114277 16352 scope.go:117] "RemoveContainer" containerID="b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7" Mar 07 21:22:22.115598 master-0 kubenswrapper[16352]: E0307 21:22:22.115540 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7\": container with ID starting with b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7 not found: ID does not exist" containerID="b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7" Mar 07 21:22:22.115784 master-0 kubenswrapper[16352]: I0307 21:22:22.115601 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7"} err="failed to get container status \"b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7\": rpc error: code = NotFound desc = could not find container \"b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7\": container with ID starting with 
b7c53a131c67dd9249f14c3855c152865fa0b385765f03450b153fc1ead4cad7 not found: ID does not exist" Mar 07 21:22:22.115784 master-0 kubenswrapper[16352]: I0307 21:22:22.115638 16352 scope.go:117] "RemoveContainer" containerID="fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da" Mar 07 21:22:22.119812 master-0 kubenswrapper[16352]: E0307 21:22:22.119764 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da\": container with ID starting with fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da not found: ID does not exist" containerID="fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da" Mar 07 21:22:22.119812 master-0 kubenswrapper[16352]: I0307 21:22:22.119799 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da"} err="failed to get container status \"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da\": rpc error: code = NotFound desc = could not find container \"fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da\": container with ID starting with fa02467dc6acce98318c96032a64c2916f017ba097ca62832e24bc9490f452da not found: ID does not exist" Mar 07 21:22:22.120027 master-0 kubenswrapper[16352]: I0307 21:22:22.119819 16352 scope.go:117] "RemoveContainer" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" Mar 07 21:22:22.121752 master-0 kubenswrapper[16352]: I0307 21:22:22.121710 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.121871 master-0 kubenswrapper[16352]: I0307 21:22:22.121759 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.125160 master-0 kubenswrapper[16352]: I0307 21:22:22.125093 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-2hhhs"] Mar 07 21:22:22.128512 master-0 kubenswrapper[16352]: I0307 21:22:22.128448 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-2hhhs"] Mar 07 21:22:22.139948 master-0 kubenswrapper[16352]: I0307 21:22:22.139873 16352 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="af3d4973-e740-4d68-84ec-13e9b5ccecbc" Mar 07 21:22:22.211304 master-0 kubenswrapper[16352]: I0307 21:22:22.211155 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 07 21:22:22.211487 master-0 kubenswrapper[16352]: I0307 21:22:22.211314 16352 scope.go:117] "RemoveContainer" containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" Mar 07 21:22:22.218343 master-0 kubenswrapper[16352]: E0307 21:22:22.218202 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998\": container with ID starting with f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998 not found: ID does not exist" 
containerID="f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998" Mar 07 21:22:22.218343 master-0 kubenswrapper[16352]: I0307 21:22:22.218282 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998"} err="failed to get container status \"f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998\": rpc error: code = NotFound desc = could not find container \"f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998\": container with ID starting with f21b9f85a445014e68b1f939548905a3e432c95936ae347ba803889e7182b998 not found: ID does not exist" Mar 07 21:22:22.228112 master-0 kubenswrapper[16352]: I0307 21:22:22.224125 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.228112 master-0 kubenswrapper[16352]: I0307 21:22:22.224312 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.228112 master-0 kubenswrapper[16352]: I0307 21:22:22.224951 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.228112 master-0 kubenswrapper[16352]: I0307 
21:22:22.224973 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.248600 master-0 kubenswrapper[16352]: I0307 21:22:22.248562 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:22:22.271596 master-0 kubenswrapper[16352]: I0307 21:22:22.271516 16352 kubelet.go:2706] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="8b4aa4c4-57fa-480a-9c9b-1706db283604" Mar 07 21:22:22.327385 master-0 kubenswrapper[16352]: I0307 21:22:22.327195 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 07 21:22:22.327385 master-0 kubenswrapper[16352]: I0307 21:22:22.327346 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secrets\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 07 21:22:22.327385 master-0 kubenswrapper[16352]: I0307 21:22:22.327381 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 07 21:22:22.327649 master-0 kubenswrapper[16352]: I0307 21:22:22.327445 16352 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 07 21:22:22.327649 master-0 kubenswrapper[16352]: I0307 21:22:22.327482 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") pod \"f78c05e1499b533b83f091333d61f045\" (UID: \"f78c05e1499b533b83f091333d61f045\") " Mar 07 21:22:22.327649 master-0 kubenswrapper[16352]: I0307 21:22:22.327468 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host" (OuterVolumeSpecName: "ssl-certs-host") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "ssl-certs-host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:22.327649 master-0 kubenswrapper[16352]: I0307 21:22:22.327527 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets" (OuterVolumeSpecName: "secrets") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "secrets". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:22.327649 master-0 kubenswrapper[16352]: I0307 21:22:22.327561 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs" (OuterVolumeSpecName: "logs") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:22.327649 master-0 kubenswrapper[16352]: I0307 21:22:22.327573 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config" (OuterVolumeSpecName: "config") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "config". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:22.327918 master-0 kubenswrapper[16352]: I0307 21:22:22.327748 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud" (OuterVolumeSpecName: "etc-kubernetes-cloud") pod "f78c05e1499b533b83f091333d61f045" (UID: "f78c05e1499b533b83f091333d61f045"). InnerVolumeSpecName "etc-kubernetes-cloud". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:22.328027 master-0 kubenswrapper[16352]: I0307 21:22:22.327995 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:22.328027 master-0 kubenswrapper[16352]: I0307 21:22:22.328024 16352 reconciler_common.go:293] "Volume detached for volume \"etc-kubernetes-cloud\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-etc-kubernetes-cloud\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:22.328089 master-0 kubenswrapper[16352]: I0307 21:22:22.328038 16352 reconciler_common.go:293] "Volume detached for volume \"ssl-certs-host\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-ssl-certs-host\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:22.328089 master-0 kubenswrapper[16352]: I0307 21:22:22.328050 16352 reconciler_common.go:293] "Volume detached for volume \"secrets\" (UniqueName: 
\"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-secrets\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:22.328089 master-0 kubenswrapper[16352]: I0307 21:22:22.328062 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/host-path/f78c05e1499b533b83f091333d61f045-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:22.437601 master-0 kubenswrapper[16352]: I0307 21:22:22.437164 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:22:22.508725 master-0 kubenswrapper[16352]: I0307 21:22:22.508193 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:22.980842 master-0 kubenswrapper[16352]: I0307 21:22:22.976524 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-84f57b9877-dwqg9"] Mar 07 21:22:22.980842 master-0 kubenswrapper[16352]: I0307 21:22:22.978109 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-84f57b9877-dwqg9" Mar 07 21:22:22.995748 master-0 kubenswrapper[16352]: I0307 21:22:22.991102 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-mkc28" Mar 07 21:22:22.995748 master-0 kubenswrapper[16352]: I0307 21:22:22.991275 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 07 21:22:22.995748 master-0 kubenswrapper[16352]: I0307 21:22:22.991387 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 07 21:22:23.011020 master-0 kubenswrapper[16352]: I0307 21:22:23.010916 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-dwqg9"] Mar 07 21:22:23.011563 master-0 kubenswrapper[16352]: I0307 21:22:23.011499 16352 generic.go:334] "Generic (PLEG): container finished" podID="2200306f-7816-4019-a6e1-5847ea5b51b1" containerID="ef97b357626aa1231949078ed219eedc8490c7ce007443d958150cfdc31df36f" exitCode=0 Mar 07 21:22:23.011681 master-0 kubenswrapper[16352]: I0307 21:22:23.011606 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"2200306f-7816-4019-a6e1-5847ea5b51b1","Type":"ContainerDied","Data":"ef97b357626aa1231949078ed219eedc8490c7ce007443d958150cfdc31df36f"} Mar 07 21:22:23.021264 master-0 kubenswrapper[16352]: I0307 21:22:23.019652 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"29fe93c228bd77fd76218d416ea847bb6245a51f2342d94714a2572a13bb2ff1"} Mar 07 21:22:23.021264 master-0 kubenswrapper[16352]: I0307 21:22:23.019713 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" 
event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"75482995cc5f55d9d7fb4b8a57bf5ec36cbaac14083b2719abeb4a1eb62846bc"} Mar 07 21:22:23.026711 master-0 kubenswrapper[16352]: I0307 21:22:23.026585 16352 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="554ffc5919fe7a46fc0ad2b26594bc2dec62e5f792ce74d74fe8d549af25bf01" exitCode=0 Mar 07 21:22:23.026711 master-0 kubenswrapper[16352]: I0307 21:22:23.026651 16352 generic.go:334] "Generic (PLEG): container finished" podID="f78c05e1499b533b83f091333d61f045" containerID="280e10e4ead7199cb4e5eb06d68976c14126e54c3ec3e9d229c33b8faed6eeb7" exitCode=0 Mar 07 21:22:23.027064 master-0 kubenswrapper[16352]: I0307 21:22:23.026783 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a5268e4b1214eb9120732792c2a482d8940b2b6e9aad29e2c1d552f0b52a5bff" Mar 07 21:22:23.027064 master-0 kubenswrapper[16352]: I0307 21:22:23.026815 16352 scope.go:117] "RemoveContainer" containerID="32662289d8af90c397599c0dd49d964d2e4a4646d5948d19fe021ab31184cd4e" Mar 07 21:22:23.027064 master-0 kubenswrapper[16352]: I0307 21:22:23.026822 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kube-system/bootstrap-kube-controller-manager-master-0" Mar 07 21:22:23.041477 master-0 kubenswrapper[16352]: I0307 21:22:23.036917 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4"} Mar 07 21:22:23.041477 master-0 kubenswrapper[16352]: I0307 21:22:23.037003 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"fc1c3d096828a345bfeef01d498942dd6490aa95b8f66018ce4d891dbf0bee27"} Mar 07 21:22:23.052383 master-0 kubenswrapper[16352]: I0307 21:22:23.052315 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-msswx\" (UniqueName: \"kubernetes.io/projected/59a192e8-491e-405e-955e-c293b335634d-kube-api-access-msswx\") pod \"downloads-84f57b9877-dwqg9\" (UID: \"59a192e8-491e-405e-955e-c293b335634d\") " pod="openshift-console/downloads-84f57b9877-dwqg9" Mar 07 21:22:23.156922 master-0 kubenswrapper[16352]: I0307 21:22:23.155718 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-msswx\" (UniqueName: \"kubernetes.io/projected/59a192e8-491e-405e-955e-c293b335634d-kube-api-access-msswx\") pod \"downloads-84f57b9877-dwqg9\" (UID: \"59a192e8-491e-405e-955e-c293b335634d\") " pod="openshift-console/downloads-84f57b9877-dwqg9" Mar 07 21:22:23.178097 master-0 kubenswrapper[16352]: I0307 21:22:23.176054 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-msswx\" (UniqueName: \"kubernetes.io/projected/59a192e8-491e-405e-955e-c293b335634d-kube-api-access-msswx\") pod \"downloads-84f57b9877-dwqg9\" (UID: 
\"59a192e8-491e-405e-955e-c293b335634d\") " pod="openshift-console/downloads-84f57b9877-dwqg9" Mar 07 21:22:23.204095 master-0 kubenswrapper[16352]: I0307 21:22:23.203938 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1a56802af72ce1aac6b5077f1695ac0" path="/var/lib/kubelet/pods/a1a56802af72ce1aac6b5077f1695ac0/volumes" Mar 07 21:22:23.205009 master-0 kubenswrapper[16352]: I0307 21:22:23.204984 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f05b2327-b1ca-4b9b-a167-68f9fcb506e6" path="/var/lib/kubelet/pods/f05b2327-b1ca-4b9b-a167-68f9fcb506e6/volumes" Mar 07 21:22:23.205568 master-0 kubenswrapper[16352]: I0307 21:22:23.205543 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f78c05e1499b533b83f091333d61f045" path="/var/lib/kubelet/pods/f78c05e1499b533b83f091333d61f045/volumes" Mar 07 21:22:23.206076 master-0 kubenswrapper[16352]: I0307 21:22:23.206039 16352 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-scheduler-master-0" podUID="" Mar 07 21:22:23.224140 master-0 kubenswrapper[16352]: I0307 21:22:23.224063 16352 mirror_client.go:130] "Deleting a mirror pod" pod="kube-system/bootstrap-kube-controller-manager-master-0" podUID="" Mar 07 21:22:23.246745 master-0 kubenswrapper[16352]: I0307 21:22:23.246488 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 07 21:22:23.246745 master-0 kubenswrapper[16352]: I0307 21:22:23.246553 16352 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="af3d4973-e740-4d68-84ec-13e9b5ccecbc" Mar 07 21:22:23.246745 master-0 kubenswrapper[16352]: I0307 21:22:23.246655 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-scheduler-master-0"] Mar 07 21:22:23.246745 master-0 kubenswrapper[16352]: I0307 21:22:23.246713 16352 kubelet.go:2673] "Unable to find 
pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-scheduler-master-0" mirrorPodUID="af3d4973-e740-4d68-84ec-13e9b5ccecbc" Mar 07 21:22:23.246745 master-0 kubenswrapper[16352]: I0307 21:22:23.246727 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 07 21:22:23.246745 master-0 kubenswrapper[16352]: I0307 21:22:23.246738 16352 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="8b4aa4c4-57fa-480a-9c9b-1706db283604" Mar 07 21:22:23.252091 master-0 kubenswrapper[16352]: I0307 21:22:23.252026 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["kube-system/bootstrap-kube-controller-manager-master-0"] Mar 07 21:22:23.252159 master-0 kubenswrapper[16352]: I0307 21:22:23.252094 16352 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="kube-system/bootstrap-kube-controller-manager-master-0" mirrorPodUID="8b4aa4c4-57fa-480a-9c9b-1706db283604" Mar 07 21:22:23.322792 master-0 kubenswrapper[16352]: I0307 21:22:23.319215 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-84f57b9877-dwqg9" Mar 07 21:22:23.326360 master-0 kubenswrapper[16352]: I0307 21:22:23.326325 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:22:23.464079 master-0 kubenswrapper[16352]: I0307 21:22:23.463859 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-var-lock\") pod \"2aadcdd5-aa40-442c-9434-97f150dddf70\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " Mar 07 21:22:23.464079 master-0 kubenswrapper[16352]: I0307 21:22:23.463937 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aadcdd5-aa40-442c-9434-97f150dddf70-kube-api-access\") pod \"2aadcdd5-aa40-442c-9434-97f150dddf70\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " Mar 07 21:22:23.464079 master-0 kubenswrapper[16352]: I0307 21:22:23.463963 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-var-lock" (OuterVolumeSpecName: "var-lock") pod "2aadcdd5-aa40-442c-9434-97f150dddf70" (UID: "2aadcdd5-aa40-442c-9434-97f150dddf70"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:23.464079 master-0 kubenswrapper[16352]: I0307 21:22:23.463988 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-kubelet-dir\") pod \"2aadcdd5-aa40-442c-9434-97f150dddf70\" (UID: \"2aadcdd5-aa40-442c-9434-97f150dddf70\") " Mar 07 21:22:23.464079 master-0 kubenswrapper[16352]: I0307 21:22:23.464040 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2aadcdd5-aa40-442c-9434-97f150dddf70" (UID: "2aadcdd5-aa40-442c-9434-97f150dddf70"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:23.464594 master-0 kubenswrapper[16352]: I0307 21:22:23.464563 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:23.464649 master-0 kubenswrapper[16352]: I0307 21:22:23.464600 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2aadcdd5-aa40-442c-9434-97f150dddf70-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:23.468518 master-0 kubenswrapper[16352]: I0307 21:22:23.468433 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aadcdd5-aa40-442c-9434-97f150dddf70-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2aadcdd5-aa40-442c-9434-97f150dddf70" (UID: "2aadcdd5-aa40-442c-9434-97f150dddf70"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:22:23.566906 master-0 kubenswrapper[16352]: I0307 21:22:23.566432 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2aadcdd5-aa40-442c-9434-97f150dddf70-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:23.808495 master-0 kubenswrapper[16352]: I0307 21:22:23.808373 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-84f57b9877-dwqg9"] Mar 07 21:22:23.823955 master-0 kubenswrapper[16352]: W0307 21:22:23.823870 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod59a192e8_491e_405e_955e_c293b335634d.slice/crio-4044a665eb292a3d3ba956938b5ba39828ad0879928a6649a77597c26ae6b858 WatchSource:0}: Error finding container 4044a665eb292a3d3ba956938b5ba39828ad0879928a6649a77597c26ae6b858: Status 404 returned error can't find the container with id 4044a665eb292a3d3ba956938b5ba39828ad0879928a6649a77597c26ae6b858 Mar 07 21:22:24.046960 master-0 kubenswrapper[16352]: I0307 21:22:24.046779 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" event={"ID":"2aadcdd5-aa40-442c-9434-97f150dddf70","Type":"ContainerDied","Data":"2cd11fd645ccc00a72bc62da05ad7ccff04a454a3604982122d7a82b3d1dda53"} Mar 07 21:22:24.046960 master-0 kubenswrapper[16352]: I0307 21:22:24.046886 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2cd11fd645ccc00a72bc62da05ad7ccff04a454a3604982122d7a82b3d1dda53" Mar 07 21:22:24.046960 master-0 kubenswrapper[16352]: I0307 21:22:24.046813 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-4-retry-1-master-0" Mar 07 21:22:24.048340 master-0 kubenswrapper[16352]: I0307 21:22:24.048286 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-dwqg9" event={"ID":"59a192e8-491e-405e-955e-c293b335634d","Type":"ContainerStarted","Data":"4044a665eb292a3d3ba956938b5ba39828ad0879928a6649a77597c26ae6b858"} Mar 07 21:22:24.055928 master-0 kubenswrapper[16352]: I0307 21:22:24.055895 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"95f51830d903c41c8ee7ab7a8d7de2c678711e9e11302d94e3d2db00f6dd7437"} Mar 07 21:22:24.056202 master-0 kubenswrapper[16352]: I0307 21:22:24.056146 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:22:24.067075 master-0 kubenswrapper[16352]: I0307 21:22:24.067006 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9"} Mar 07 21:22:24.067075 master-0 kubenswrapper[16352]: I0307 21:22:24.067079 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61"} Mar 07 21:22:24.067324 master-0 kubenswrapper[16352]: I0307 21:22:24.067102 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"2c0128d80fea5834fc0f12bf23cdfbeeabbf5c415717881d7c9c6db472d9dd3f"} Mar 07 21:22:24.090407 master-0 kubenswrapper[16352]: I0307 21:22:24.089292 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=4.089266925 podStartE2EDuration="4.089266925s" podCreationTimestamp="2026-03-07 21:22:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:22:24.087188115 +0000 UTC m=+267.157893174" watchObservedRunningTime="2026-03-07 21:22:24.089266925 +0000 UTC m=+267.159971984" Mar 07 21:22:24.125297 master-0 kubenswrapper[16352]: I0307 21:22:24.125156 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.1251209 podStartE2EDuration="2.1251209s" podCreationTimestamp="2026-03-07 21:22:22 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:22:24.119579336 +0000 UTC m=+267.190284435" watchObservedRunningTime="2026-03-07 21:22:24.1251209 +0000 UTC m=+267.195825969" Mar 07 21:22:24.480256 master-0 kubenswrapper[16352]: I0307 21:22:24.480209 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:22:24.488905 master-0 kubenswrapper[16352]: I0307 21:22:24.488837 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-kubelet-dir\") pod \"2200306f-7816-4019-a6e1-5847ea5b51b1\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " Mar 07 21:22:24.489028 master-0 kubenswrapper[16352]: I0307 21:22:24.488986 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "2200306f-7816-4019-a6e1-5847ea5b51b1" (UID: "2200306f-7816-4019-a6e1-5847ea5b51b1"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:24.489095 master-0 kubenswrapper[16352]: I0307 21:22:24.489029 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2200306f-7816-4019-a6e1-5847ea5b51b1-kube-api-access\") pod \"2200306f-7816-4019-a6e1-5847ea5b51b1\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " Mar 07 21:22:24.489134 master-0 kubenswrapper[16352]: I0307 21:22:24.489107 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-var-lock\") pod \"2200306f-7816-4019-a6e1-5847ea5b51b1\" (UID: \"2200306f-7816-4019-a6e1-5847ea5b51b1\") " Mar 07 21:22:24.489471 master-0 kubenswrapper[16352]: I0307 21:22:24.489290 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-var-lock" (OuterVolumeSpecName: "var-lock") pod "2200306f-7816-4019-a6e1-5847ea5b51b1" (UID: "2200306f-7816-4019-a6e1-5847ea5b51b1"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:24.490264 master-0 kubenswrapper[16352]: I0307 21:22:24.489738 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:24.490264 master-0 kubenswrapper[16352]: I0307 21:22:24.489812 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/2200306f-7816-4019-a6e1-5847ea5b51b1-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:24.494301 master-0 kubenswrapper[16352]: I0307 21:22:24.494262 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2200306f-7816-4019-a6e1-5847ea5b51b1-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "2200306f-7816-4019-a6e1-5847ea5b51b1" (UID: "2200306f-7816-4019-a6e1-5847ea5b51b1"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:22:24.591283 master-0 kubenswrapper[16352]: I0307 21:22:24.591215 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2200306f-7816-4019-a6e1-5847ea5b51b1-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:25.083462 master-0 kubenswrapper[16352]: I0307 21:22:25.083391 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-3-master-0" Mar 07 21:22:25.084052 master-0 kubenswrapper[16352]: I0307 21:22:25.083801 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-3-master-0" event={"ID":"2200306f-7816-4019-a6e1-5847ea5b51b1","Type":"ContainerDied","Data":"1b0ee44965e1b527225d1cfb842b0cc111aef93935c0770d430b0d28fc3b7411"} Mar 07 21:22:25.084052 master-0 kubenswrapper[16352]: I0307 21:22:25.083888 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b0ee44965e1b527225d1cfb842b0cc111aef93935c0770d430b0d28fc3b7411" Mar 07 21:22:30.137822 master-0 kubenswrapper[16352]: I0307 21:22:30.136960 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-mmqbs_982319eb-2dc2-4faa-85d8-ee11840179fd/multus-admission-controller/0.log" Mar 07 21:22:30.137822 master-0 kubenswrapper[16352]: I0307 21:22:30.137031 16352 generic.go:334] "Generic (PLEG): container finished" podID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerID="d0c8f910f29b908238dbc63bf9ac7b0f87a9546eaf7538fe52110d4fc58afa92" exitCode=137 Mar 07 21:22:30.137822 master-0 kubenswrapper[16352]: I0307 21:22:30.137071 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" event={"ID":"982319eb-2dc2-4faa-85d8-ee11840179fd","Type":"ContainerDied","Data":"d0c8f910f29b908238dbc63bf9ac7b0f87a9546eaf7538fe52110d4fc58afa92"} Mar 07 21:22:30.709443 master-0 kubenswrapper[16352]: I0307 21:22:30.709385 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-mmqbs_982319eb-2dc2-4faa-85d8-ee11840179fd/multus-admission-controller/0.log" Mar 07 21:22:30.709709 master-0 kubenswrapper[16352]: I0307 21:22:30.709505 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:22:30.823543 master-0 kubenswrapper[16352]: I0307 21:22:30.823457 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") pod \"982319eb-2dc2-4faa-85d8-ee11840179fd\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " Mar 07 21:22:30.823819 master-0 kubenswrapper[16352]: I0307 21:22:30.823794 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") pod \"982319eb-2dc2-4faa-85d8-ee11840179fd\" (UID: \"982319eb-2dc2-4faa-85d8-ee11840179fd\") " Mar 07 21:22:30.828164 master-0 kubenswrapper[16352]: I0307 21:22:30.827984 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj" (OuterVolumeSpecName: "kube-api-access-9rkvj") pod "982319eb-2dc2-4faa-85d8-ee11840179fd" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd"). InnerVolumeSpecName "kube-api-access-9rkvj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:22:30.829934 master-0 kubenswrapper[16352]: I0307 21:22:30.829885 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "982319eb-2dc2-4faa-85d8-ee11840179fd" (UID: "982319eb-2dc2-4faa-85d8-ee11840179fd"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:22:30.926326 master-0 kubenswrapper[16352]: I0307 21:22:30.926105 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rkvj\" (UniqueName: \"kubernetes.io/projected/982319eb-2dc2-4faa-85d8-ee11840179fd-kube-api-access-9rkvj\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:30.926326 master-0 kubenswrapper[16352]: I0307 21:22:30.926240 16352 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/982319eb-2dc2-4faa-85d8-ee11840179fd-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:31.159559 master-0 kubenswrapper[16352]: I0307 21:22:31.158040 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-8d675b596-mmqbs_982319eb-2dc2-4faa-85d8-ee11840179fd/multus-admission-controller/0.log" Mar 07 21:22:31.159559 master-0 kubenswrapper[16352]: I0307 21:22:31.158161 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" event={"ID":"982319eb-2dc2-4faa-85d8-ee11840179fd","Type":"ContainerDied","Data":"54a20e1f511152c3ce1af3a4ee865982dc446fb94c7eea743ba8661a12deba25"} Mar 07 21:22:31.159559 master-0 kubenswrapper[16352]: I0307 21:22:31.158249 16352 scope.go:117] "RemoveContainer" containerID="3a5ae5606c3fe49b9c95657bc133ba344c9c2ef5cc32f8c9971a5b271f1840f6" Mar 07 21:22:31.159559 master-0 kubenswrapper[16352]: I0307 21:22:31.158259 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-8d675b596-mmqbs" Mar 07 21:22:31.182948 master-0 kubenswrapper[16352]: I0307 21:22:31.182891 16352 scope.go:117] "RemoveContainer" containerID="d0c8f910f29b908238dbc63bf9ac7b0f87a9546eaf7538fe52110d4fc58afa92" Mar 07 21:22:31.203312 master-0 kubenswrapper[16352]: I0307 21:22:31.203217 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-mmqbs"] Mar 07 21:22:31.207356 master-0 kubenswrapper[16352]: I0307 21:22:31.207276 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-8d675b596-mmqbs"] Mar 07 21:22:32.509204 master-0 kubenswrapper[16352]: I0307 21:22:32.509093 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:32.509204 master-0 kubenswrapper[16352]: I0307 21:22:32.509191 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:32.509204 master-0 kubenswrapper[16352]: I0307 21:22:32.509206 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:32.512263 master-0 kubenswrapper[16352]: I0307 21:22:32.509282 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:32.514382 master-0 kubenswrapper[16352]: I0307 21:22:32.514333 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:32.517616 master-0 kubenswrapper[16352]: I0307 21:22:32.517523 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:33.182900 master-0 kubenswrapper[16352]: I0307 21:22:33.182793 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:33.185163 master-0 kubenswrapper[16352]: I0307 21:22:33.185079 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:22:33.207396 master-0 kubenswrapper[16352]: I0307 21:22:33.207345 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" path="/var/lib/kubelet/pods/982319eb-2dc2-4faa-85d8-ee11840179fd/volumes" Mar 07 21:22:33.787242 master-0 kubenswrapper[16352]: E0307 21:22:33.787108 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[alertmanager-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" Mar 07 21:22:34.188925 master-0 kubenswrapper[16352]: I0307 21:22:34.188234 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:22:34.357836 master-0 kubenswrapper[16352]: I0307 21:22:34.357768 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 07 21:22:34.358398 master-0 kubenswrapper[16352]: I0307 21:22:34.358359 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-2-master-0" podUID="c311d423-1179-467a-a50c-3e38e5d6e5ed" containerName="installer" containerID="cri-o://0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59" gracePeriod=30 Mar 07 21:22:37.570675 master-0 kubenswrapper[16352]: I0307 21:22:37.570586 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:22:37.572781 master-0 kubenswrapper[16352]: I0307 21:22:37.572719 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:22:37.792850 master-0 kubenswrapper[16352]: I0307 21:22:37.792788 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-7bx66" Mar 07 21:22:37.800194 master-0 kubenswrapper[16352]: I0307 21:22:37.800108 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:22:38.249337 master-0 kubenswrapper[16352]: I0307 21:22:38.249271 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 07 21:22:38.830904 master-0 kubenswrapper[16352]: E0307 21:22:38.830614 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[prometheus-trusted-ca-bundle], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-monitoring/prometheus-k8s-0" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" Mar 07 21:22:38.959814 master-0 kubenswrapper[16352]: I0307 21:22:38.959745 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 07 21:22:38.960123 master-0 kubenswrapper[16352]: E0307 21:22:38.960110 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="multus-admission-controller" Mar 07 21:22:38.960205 master-0 kubenswrapper[16352]: I0307 21:22:38.960127 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="multus-admission-controller" Mar 07 21:22:38.960205 master-0 kubenswrapper[16352]: E0307 21:22:38.960150 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="kube-rbac-proxy" Mar 07 21:22:38.960205 master-0 kubenswrapper[16352]: I0307 21:22:38.960159 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="kube-rbac-proxy" Mar 07 21:22:38.960205 master-0 kubenswrapper[16352]: E0307 21:22:38.960174 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2200306f-7816-4019-a6e1-5847ea5b51b1" containerName="installer" Mar 07 21:22:38.960205 master-0 kubenswrapper[16352]: I0307 21:22:38.960182 16352 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="2200306f-7816-4019-a6e1-5847ea5b51b1" containerName="installer" Mar 07 21:22:38.960205 master-0 kubenswrapper[16352]: E0307 21:22:38.960201 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aadcdd5-aa40-442c-9434-97f150dddf70" containerName="installer" Mar 07 21:22:38.960205 master-0 kubenswrapper[16352]: I0307 21:22:38.960209 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aadcdd5-aa40-442c-9434-97f150dddf70" containerName="installer" Mar 07 21:22:38.960654 master-0 kubenswrapper[16352]: I0307 21:22:38.960354 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="multus-admission-controller" Mar 07 21:22:38.960654 master-0 kubenswrapper[16352]: I0307 21:22:38.960377 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2200306f-7816-4019-a6e1-5847ea5b51b1" containerName="installer" Mar 07 21:22:38.960654 master-0 kubenswrapper[16352]: I0307 21:22:38.960393 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aadcdd5-aa40-442c-9434-97f150dddf70" containerName="installer" Mar 07 21:22:38.960654 master-0 kubenswrapper[16352]: I0307 21:22:38.960412 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="982319eb-2dc2-4faa-85d8-ee11840179fd" containerName="kube-rbac-proxy" Mar 07 21:22:38.961050 master-0 kubenswrapper[16352]: I0307 21:22:38.961001 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:38.981098 master-0 kubenswrapper[16352]: I0307 21:22:38.981041 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 07 21:22:39.005563 master-0 kubenswrapper[16352]: I0307 21:22:38.999712 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-var-lock\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.005563 master-0 kubenswrapper[16352]: I0307 21:22:38.999796 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.005563 master-0 kubenswrapper[16352]: I0307 21:22:39.000060 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34534f8-0d38-40a8-a28c-11c20ce64353-kube-api-access\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.101979 master-0 kubenswrapper[16352]: I0307 21:22:39.101903 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34534f8-0d38-40a8-a28c-11c20ce64353-kube-api-access\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.102211 master-0 kubenswrapper[16352]: I0307 21:22:39.102145 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-var-lock\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.102211 master-0 kubenswrapper[16352]: I0307 21:22:39.102197 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.102331 master-0 kubenswrapper[16352]: I0307 21:22:39.102305 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-kubelet-dir\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.102386 master-0 kubenswrapper[16352]: I0307 21:22:39.102354 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-var-lock\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.123060 master-0 kubenswrapper[16352]: I0307 21:22:39.123010 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34534f8-0d38-40a8-a28c-11c20ce64353-kube-api-access\") pod \"installer-3-master-0\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.240882 master-0 kubenswrapper[16352]: I0307 21:22:39.240815 16352 generic.go:334] "Generic (PLEG): container 
finished" podID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerID="2a0b5efee8d9bea443d0c78f75b1bbd14c05bdb0d02fbf32cc6350f09b3b5043" exitCode=0 Mar 07 21:22:39.241138 master-0 kubenswrapper[16352]: I0307 21:22:39.240916 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"2a0b5efee8d9bea443d0c78f75b1bbd14c05bdb0d02fbf32cc6350f09b3b5043"} Mar 07 21:22:39.241138 master-0 kubenswrapper[16352]: I0307 21:22:39.240977 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:22:39.241138 master-0 kubenswrapper[16352]: I0307 21:22:39.240999 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerStarted","Data":"f35b2f615a6cdc98ab3a5e215d01153cd89ba8361e71f5dc88ede17f7c042fc2"} Mar 07 21:22:39.284798 master-0 kubenswrapper[16352]: I0307 21:22:39.284740 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:22:39.713170 master-0 kubenswrapper[16352]: I0307 21:22:39.713023 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 07 21:22:39.721292 master-0 kubenswrapper[16352]: W0307 21:22:39.719549 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc34534f8_0d38_40a8_a28c_11c20ce64353.slice/crio-ad3ef05c4fcf71fd72985b9596a198bcc59854b6714fb20c4ba41f018577c4dd WatchSource:0}: Error finding container ad3ef05c4fcf71fd72985b9596a198bcc59854b6714fb20c4ba41f018577c4dd: Status 404 returned error can't find the container with id ad3ef05c4fcf71fd72985b9596a198bcc59854b6714fb20c4ba41f018577c4dd Mar 07 21:22:40.251648 master-0 kubenswrapper[16352]: I0307 21:22:40.251566 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"c34534f8-0d38-40a8-a28c-11c20ce64353","Type":"ContainerStarted","Data":"ad3ef05c4fcf71fd72985b9596a198bcc59854b6714fb20c4ba41f018577c4dd"} Mar 07 21:22:41.261855 master-0 kubenswrapper[16352]: I0307 21:22:41.261783 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerStarted","Data":"8990d92d7314bd8c6c9472dce3afd8d9a5d4579a2e23086a7bfdf4b6e779d5de"} Mar 07 21:22:41.263617 master-0 kubenswrapper[16352]: I0307 21:22:41.263554 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"c34534f8-0d38-40a8-a28c-11c20ce64353","Type":"ContainerStarted","Data":"63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990"} Mar 07 21:22:41.289001 master-0 kubenswrapper[16352]: I0307 21:22:41.288890 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-3-master-0" 
podStartSLOduration=3.288866049 podStartE2EDuration="3.288866049s" podCreationTimestamp="2026-03-07 21:22:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:22:41.286482902 +0000 UTC m=+284.357187961" watchObservedRunningTime="2026-03-07 21:22:41.288866049 +0000 UTC m=+284.359571108" Mar 07 21:22:42.277603 master-0 kubenswrapper[16352]: I0307 21:22:42.277412 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerStarted","Data":"df0305c3b23c7fc5354180436d83231087c4a341c296678d6e5b76e30d0f72c4"} Mar 07 21:22:42.277603 master-0 kubenswrapper[16352]: I0307 21:22:42.277595 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerStarted","Data":"e4adb18ca8c14077d35457f3a2cf7fcda5afb5906a69465cfa0e4c206ff04578"} Mar 07 21:22:42.277603 master-0 kubenswrapper[16352]: I0307 21:22:42.277615 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerStarted","Data":"7d4a3171cf827d2bd252bd67d3527faaeb48ec4ad32b82c909c0de55f87057fa"} Mar 07 21:22:42.277603 master-0 kubenswrapper[16352]: I0307 21:22:42.277631 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerStarted","Data":"8b7b3e87871f954bd434780d4486a3763a32e7e97d8bda5a8f30d82a22dc54fa"} Mar 07 21:22:42.278488 master-0 kubenswrapper[16352]: I0307 21:22:42.277648 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerStarted","Data":"a53a9a1071ac13a88d1951a193326d32b83a1c1780abad0e0dafbb3804cc8bca"} Mar 07 21:22:42.315474 master-0 kubenswrapper[16352]: I0307 21:22:42.315310 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=253.628195195 podStartE2EDuration="4m15.315281557s" podCreationTimestamp="2026-03-07 21:18:27 +0000 UTC" firstStartedPulling="2026-03-07 21:22:39.246216291 +0000 UTC m=+282.316921340" lastFinishedPulling="2026-03-07 21:22:40.933302643 +0000 UTC m=+284.004007702" observedRunningTime="2026-03-07 21:22:42.308442023 +0000 UTC m=+285.379147092" watchObservedRunningTime="2026-03-07 21:22:42.315281557 +0000 UTC m=+285.385986626" Mar 07 21:22:42.773102 master-0 kubenswrapper[16352]: I0307 21:22:42.773021 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:22:42.774921 master-0 kubenswrapper[16352]: I0307 21:22:42.774878 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:22:42.845782 master-0 kubenswrapper[16352]: I0307 21:22:42.845723 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-l4g8x" Mar 07 21:22:42.855248 master-0 kubenswrapper[16352]: I0307 21:22:42.855105 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:22:42.925432 master-0 kubenswrapper[16352]: I0307 21:22:42.925314 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_c311d423-1179-467a-a50c-3e38e5d6e5ed/installer/0.log" Mar 07 21:22:42.925432 master-0 kubenswrapper[16352]: I0307 21:22:42.925441 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 07 21:22:43.078119 master-0 kubenswrapper[16352]: I0307 21:22:43.078051 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-var-lock\") pod \"c311d423-1179-467a-a50c-3e38e5d6e5ed\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " Mar 07 21:22:43.078448 master-0 kubenswrapper[16352]: I0307 21:22:43.078172 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-kubelet-dir\") pod \"c311d423-1179-467a-a50c-3e38e5d6e5ed\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " Mar 07 21:22:43.078448 master-0 kubenswrapper[16352]: I0307 21:22:43.078221 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-var-lock" (OuterVolumeSpecName: "var-lock") pod "c311d423-1179-467a-a50c-3e38e5d6e5ed" (UID: "c311d423-1179-467a-a50c-3e38e5d6e5ed"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:43.078448 master-0 kubenswrapper[16352]: I0307 21:22:43.078261 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c311d423-1179-467a-a50c-3e38e5d6e5ed-kube-api-access\") pod \"c311d423-1179-467a-a50c-3e38e5d6e5ed\" (UID: \"c311d423-1179-467a-a50c-3e38e5d6e5ed\") " Mar 07 21:22:43.078448 master-0 kubenswrapper[16352]: I0307 21:22:43.078404 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c311d423-1179-467a-a50c-3e38e5d6e5ed" (UID: "c311d423-1179-467a-a50c-3e38e5d6e5ed"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:22:43.079167 master-0 kubenswrapper[16352]: I0307 21:22:43.079114 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:43.079167 master-0 kubenswrapper[16352]: I0307 21:22:43.079142 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c311d423-1179-467a-a50c-3e38e5d6e5ed-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:43.082701 master-0 kubenswrapper[16352]: I0307 21:22:43.082644 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c311d423-1179-467a-a50c-3e38e5d6e5ed-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c311d423-1179-467a-a50c-3e38e5d6e5ed" (UID: "c311d423-1179-467a-a50c-3e38e5d6e5ed"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:22:43.181489 master-0 kubenswrapper[16352]: I0307 21:22:43.181243 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c311d423-1179-467a-a50c-3e38e5d6e5ed-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:22:43.293537 master-0 kubenswrapper[16352]: I0307 21:22:43.293476 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-2-master-0_c311d423-1179-467a-a50c-3e38e5d6e5ed/installer/0.log" Mar 07 21:22:43.294269 master-0 kubenswrapper[16352]: I0307 21:22:43.293571 16352 generic.go:334] "Generic (PLEG): container finished" podID="c311d423-1179-467a-a50c-3e38e5d6e5ed" containerID="0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59" exitCode=1 Mar 07 21:22:43.297398 master-0 kubenswrapper[16352]: I0307 21:22:43.297338 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-2-master-0" Mar 07 21:22:43.298591 master-0 kubenswrapper[16352]: I0307 21:22:43.298548 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c311d423-1179-467a-a50c-3e38e5d6e5ed","Type":"ContainerDied","Data":"0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59"} Mar 07 21:22:43.298675 master-0 kubenswrapper[16352]: I0307 21:22:43.298614 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-2-master-0" event={"ID":"c311d423-1179-467a-a50c-3e38e5d6e5ed","Type":"ContainerDied","Data":"16b6a716cff70f743c159337f640fae48c79962073b4c024c5cc09d587c4526e"} Mar 07 21:22:43.298763 master-0 kubenswrapper[16352]: I0307 21:22:43.298656 16352 scope.go:117] "RemoveContainer" containerID="0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59" Mar 07 21:22:43.329488 master-0 kubenswrapper[16352]: I0307 
21:22:43.329414 16352 scope.go:117] "RemoveContainer" containerID="0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59" Mar 07 21:22:43.330154 master-0 kubenswrapper[16352]: E0307 21:22:43.330105 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59\": container with ID starting with 0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59 not found: ID does not exist" containerID="0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59" Mar 07 21:22:43.330235 master-0 kubenswrapper[16352]: I0307 21:22:43.330173 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59"} err="failed to get container status \"0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59\": rpc error: code = NotFound desc = could not find container \"0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59\": container with ID starting with 0f25db4f9101c7b7d6294d4ae5042c1479a1d0aa096ef94a14abcc1ac2252e59 not found: ID does not exist" Mar 07 21:22:43.341897 master-0 kubenswrapper[16352]: I0307 21:22:43.341831 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 07 21:22:43.349114 master-0 kubenswrapper[16352]: I0307 21:22:43.349037 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/installer-2-master-0"] Mar 07 21:22:43.351847 master-0 kubenswrapper[16352]: W0307 21:22:43.351786 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ca1461_37ed_4e6b_a289_9f3249d52a24.slice/crio-0e0cb4ab6408b81aa5d639c8fafe19c4036ee4f4337d54a21f5c642bab0143ca WatchSource:0}: Error finding container 
0e0cb4ab6408b81aa5d639c8fafe19c4036ee4f4337d54a21f5c642bab0143ca: Status 404 returned error can't find the container with id 0e0cb4ab6408b81aa5d639c8fafe19c4036ee4f4337d54a21f5c642bab0143ca Mar 07 21:22:43.352561 master-0 kubenswrapper[16352]: I0307 21:22:43.352488 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:22:44.305843 master-0 kubenswrapper[16352]: I0307 21:22:44.305536 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerID="5a22995b60cfc534ab9fd6af1093820ee79bf42f1f476f345c1e976f3a3fc80b" exitCode=0 Mar 07 21:22:44.305843 master-0 kubenswrapper[16352]: I0307 21:22:44.305668 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"5a22995b60cfc534ab9fd6af1093820ee79bf42f1f476f345c1e976f3a3fc80b"} Mar 07 21:22:44.305843 master-0 kubenswrapper[16352]: I0307 21:22:44.305755 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerStarted","Data":"0e0cb4ab6408b81aa5d639c8fafe19c4036ee4f4337d54a21f5c642bab0143ca"} Mar 07 21:22:45.202854 master-0 kubenswrapper[16352]: I0307 21:22:45.200972 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c311d423-1179-467a-a50c-3e38e5d6e5ed" path="/var/lib/kubelet/pods/c311d423-1179-467a-a50c-3e38e5d6e5ed/volumes" Mar 07 21:22:48.197815 master-0 kubenswrapper[16352]: I0307 21:22:48.196675 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-64d844fb5f-9b28j"] Mar 07 21:22:48.197815 master-0 kubenswrapper[16352]: E0307 21:22:48.197156 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c311d423-1179-467a-a50c-3e38e5d6e5ed" containerName="installer" Mar 07 21:22:48.197815 master-0 kubenswrapper[16352]: 
I0307 21:22:48.197177 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c311d423-1179-467a-a50c-3e38e5d6e5ed" containerName="installer" Mar 07 21:22:48.197815 master-0 kubenswrapper[16352]: I0307 21:22:48.197391 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c311d423-1179-467a-a50c-3e38e5d6e5ed" containerName="installer" Mar 07 21:22:48.198572 master-0 kubenswrapper[16352]: I0307 21:22:48.198115 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.205760 master-0 kubenswrapper[16352]: I0307 21:22:48.204380 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-psq7z" Mar 07 21:22:48.205760 master-0 kubenswrapper[16352]: I0307 21:22:48.204642 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 07 21:22:48.205760 master-0 kubenswrapper[16352]: I0307 21:22:48.204801 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 07 21:22:48.205760 master-0 kubenswrapper[16352]: I0307 21:22:48.204975 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 07 21:22:48.205760 master-0 kubenswrapper[16352]: I0307 21:22:48.205089 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 07 21:22:48.205760 master-0 kubenswrapper[16352]: I0307 21:22:48.205256 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 07 21:22:48.206109 master-0 kubenswrapper[16352]: I0307 21:22:48.205896 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-serving-cert\") pod 
\"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.206109 master-0 kubenswrapper[16352]: I0307 21:22:48.205930 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-oauth-config\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.206109 master-0 kubenswrapper[16352]: I0307 21:22:48.205990 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmkfj\" (UniqueName: \"kubernetes.io/projected/253bb615-1b60-4112-aee8-f572d1c84114-kube-api-access-xmkfj\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.206109 master-0 kubenswrapper[16352]: I0307 21:22:48.206014 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-console-config\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.206109 master-0 kubenswrapper[16352]: I0307 21:22:48.206033 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-service-ca\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.206109 master-0 kubenswrapper[16352]: I0307 21:22:48.206068 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-oauth-serving-cert\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.206109 master-0 kubenswrapper[16352]: I0307 21:22:48.206093 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-trusted-ca-bundle\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.231532 master-0 kubenswrapper[16352]: I0307 21:22:48.225856 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 07 21:22:48.239330 master-0 kubenswrapper[16352]: I0307 21:22:48.238417 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d844fb5f-9b28j"] Mar 07 21:22:48.275099 master-0 kubenswrapper[16352]: I0307 21:22:48.275015 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"] Mar 07 21:22:48.275493 master-0 kubenswrapper[16352]: I0307 21:22:48.275397 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" podUID="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" containerName="route-controller-manager" containerID="cri-o://e28998f60449f58259c0cdb625118f7b6c9387b10abccbf9ef475bb39dbd3f74" gracePeriod=30 Mar 07 21:22:48.300141 master-0 kubenswrapper[16352]: I0307 21:22:48.298879 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-67c6dd6955-hbksv"] Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.308471 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-serving-cert\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.308562 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-oauth-config\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.308637 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmkfj\" (UniqueName: \"kubernetes.io/projected/253bb615-1b60-4112-aee8-f572d1c84114-kube-api-access-xmkfj\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.308695 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-console-config\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.308748 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-service-ca\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 
kubenswrapper[16352]: I0307 21:22:48.308811 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-oauth-serving-cert\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.308860 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-trusted-ca-bundle\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.310314 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-trusted-ca-bundle\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.311294 master-0 kubenswrapper[16352]: I0307 21:22:48.310595 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-service-ca\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.318711 master-0 kubenswrapper[16352]: I0307 21:22:48.312443 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-oauth-serving-cert\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 
21:22:48.318711 master-0 kubenswrapper[16352]: I0307 21:22:48.313823 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-console-config\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.324712 master-0 kubenswrapper[16352]: I0307 21:22:48.319592 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"] Mar 07 21:22:48.324712 master-0 kubenswrapper[16352]: I0307 21:22:48.319962 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" podUID="6deed9a9-6702-4177-a35d-58ad9930a893" containerName="controller-manager" containerID="cri-o://ad261fabb7ddabed91944dcee1de6f4489253aa6b0b8c94f1078f8b07e107a86" gracePeriod=30 Mar 07 21:22:48.324712 master-0 kubenswrapper[16352]: I0307 21:22:48.320374 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-oauth-config\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.330712 master-0 kubenswrapper[16352]: I0307 21:22:48.326333 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-serving-cert\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.358584 master-0 kubenswrapper[16352]: I0307 21:22:48.358525 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmkfj\" (UniqueName: 
\"kubernetes.io/projected/253bb615-1b60-4112-aee8-f572d1c84114-kube-api-access-xmkfj\") pod \"console-64d844fb5f-9b28j\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:48.575344 master-0 kubenswrapper[16352]: I0307 21:22:48.575220 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:22:50.015059 master-0 kubenswrapper[16352]: I0307 21:22:50.014902 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 07 21:22:50.016419 master-0 kubenswrapper[16352]: I0307 21:22:50.016392 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.018955 master-0 kubenswrapper[16352]: I0307 21:22:50.018274 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd"/"kube-root-ca.crt" Mar 07 21:22:50.021350 master-0 kubenswrapper[16352]: I0307 21:22:50.021325 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd"/"installer-sa-dockercfg-bz4t2" Mar 07 21:22:50.025464 master-0 kubenswrapper[16352]: I0307 21:22:50.025443 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"] Mar 07 21:22:50.152856 master-0 kubenswrapper[16352]: I0307 21:22:50.152767 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23d4915d-4b88-4875-b794-414b5b7a1d7b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.153139 master-0 kubenswrapper[16352]: I0307 21:22:50.153077 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: 
\"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-var-lock\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.155127 master-0 kubenswrapper[16352]: I0307 21:22:50.153280 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.255711 master-0 kubenswrapper[16352]: I0307 21:22:50.255561 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-var-lock\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.256086 master-0 kubenswrapper[16352]: I0307 21:22:50.255752 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-kubelet-dir\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.256086 master-0 kubenswrapper[16352]: I0307 21:22:50.255768 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-var-lock\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.256086 master-0 kubenswrapper[16352]: I0307 21:22:50.255939 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-kubelet-dir\") pod 
\"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.256086 master-0 kubenswrapper[16352]: I0307 21:22:50.255957 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23d4915d-4b88-4875-b794-414b5b7a1d7b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.273741 master-0 kubenswrapper[16352]: I0307 21:22:50.273632 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23d4915d-4b88-4875-b794-414b5b7a1d7b-kube-api-access\") pod \"installer-2-master-0\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:50.347381 master-0 kubenswrapper[16352]: I0307 21:22:50.347330 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 07 21:22:53.411936 master-0 kubenswrapper[16352]: I0307 21:22:53.411858 16352 patch_prober.go:28] interesting pod/route-controller-manager-cdf659ffc-4969h container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" start-of-body= Mar 07 21:22:53.412748 master-0 kubenswrapper[16352]: I0307 21:22:53.411947 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" podUID="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.128.0.53:8443/healthz\": dial tcp 10.128.0.53:8443: connect: connection refused" Mar 07 21:22:55.065452 master-0 kubenswrapper[16352]: I0307 21:22:55.065377 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 07 21:22:55.066571 master-0 kubenswrapper[16352]: I0307 21:22:55.066547 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.068267 master-0 kubenswrapper[16352]: I0307 21:22:55.068237 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-8cblb" Mar 07 21:22:55.069179 master-0 kubenswrapper[16352]: I0307 21:22:55.069153 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt" Mar 07 21:22:55.078006 master-0 kubenswrapper[16352]: I0307 21:22:55.077950 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"] Mar 07 21:22:55.253860 master-0 kubenswrapper[16352]: I0307 21:22:55.253747 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.253860 master-0 kubenswrapper[16352]: I0307 21:22:55.253869 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96e31400-86e3-46d2-97ee-12fd3e17893a-kube-api-access\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.254166 master-0 kubenswrapper[16352]: I0307 21:22:55.254072 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-var-lock\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.355893 master-0 kubenswrapper[16352]: I0307 21:22:55.355811 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-var-lock\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.356283 master-0 kubenswrapper[16352]: I0307 21:22:55.355935 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-var-lock\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.356283 master-0 kubenswrapper[16352]: I0307 21:22:55.355941 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.356283 master-0 kubenswrapper[16352]: I0307 21:22:55.356032 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96e31400-86e3-46d2-97ee-12fd3e17893a-kube-api-access\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.356283 master-0 kubenswrapper[16352]: I0307 21:22:55.356227 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.375006 master-0 kubenswrapper[16352]: I0307 21:22:55.374011 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96e31400-86e3-46d2-97ee-12fd3e17893a-kube-api-access\") pod \"installer-5-master-0\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.394170 master-0 kubenswrapper[16352]: I0307 21:22:55.394040 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:22:55.741991 master-0 kubenswrapper[16352]: I0307 21:22:55.741803 16352 patch_prober.go:28] interesting pod/controller-manager-86d86fcf49-hgbkg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" start-of-body= Mar 07 21:22:55.741991 master-0 kubenswrapper[16352]: I0307 21:22:55.741901 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" podUID="6deed9a9-6702-4177-a35d-58ad9930a893" containerName="controller-manager" probeResult="failure" output="Get \"https://10.128.0.52:8443/healthz\": dial tcp 10.128.0.52:8443: connect: connection refused" Mar 07 21:22:57.388899 master-0 kubenswrapper[16352]: I0307 21:22:57.388830 16352 scope.go:117] "RemoveContainer" containerID="280e10e4ead7199cb4e5eb06d68976c14126e54c3ec3e9d229c33b8faed6eeb7" Mar 07 21:22:58.362473 master-0 kubenswrapper[16352]: I0307 21:22:58.362382 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 07 21:22:58.362832 master-0 kubenswrapper[16352]: I0307 21:22:58.362720 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/installer-3-master-0" podUID="c34534f8-0d38-40a8-a28c-11c20ce64353" containerName="installer" 
containerID="cri-o://63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990" gracePeriod=30 Mar 07 21:22:58.477571 master-0 kubenswrapper[16352]: I0307 21:22:58.477475 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 07 21:22:58.479163 master-0 kubenswrapper[16352]: I0307 21:22:58.479115 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 07 21:22:58.483963 master-0 kubenswrapper[16352]: I0307 21:22:58.483867 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 07 21:22:58.486219 master-0 kubenswrapper[16352]: I0307 21:22:58.486184 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-xlnsg" Mar 07 21:22:58.489943 master-0 kubenswrapper[16352]: I0307 21:22:58.489871 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"] Mar 07 21:22:58.630149 master-0 kubenswrapper[16352]: I0307 21:22:58.630024 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 07 21:22:58.630947 master-0 kubenswrapper[16352]: I0307 21:22:58.630872 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-var-lock\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0" Mar 07 21:22:58.631048 master-0 
kubenswrapper[16352]: I0307 21:22:58.631014 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kube-api-access\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:22:58.733954 master-0 kubenswrapper[16352]: I0307 21:22:58.733889 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-var-lock\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:22:58.733954 master-0 kubenswrapper[16352]: I0307 21:22:58.733959 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kube-api-access\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:22:58.734287 master-0 kubenswrapper[16352]: I0307 21:22:58.734014 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:22:58.734287 master-0 kubenswrapper[16352]: I0307 21:22:58.734061 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-var-lock\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:22:58.734287 master-0 kubenswrapper[16352]: I0307 21:22:58.734281 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:22:58.752876 master-0 kubenswrapper[16352]: I0307 21:22:58.752826 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kube-api-access\") pod \"installer-4-master-0\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:22:58.800067 master-0 kubenswrapper[16352]: I0307 21:22:58.799981 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0"
Mar 07 21:23:01.511654 master-0 kubenswrapper[16352]: I0307 21:23:01.511572 16352 generic.go:334] "Generic (PLEG): container finished" podID="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" containerID="e28998f60449f58259c0cdb625118f7b6c9387b10abccbf9ef475bb39dbd3f74" exitCode=0
Mar 07 21:23:01.512519 master-0 kubenswrapper[16352]: I0307 21:23:01.511762 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" event={"ID":"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907","Type":"ContainerDied","Data":"e28998f60449f58259c0cdb625118f7b6c9387b10abccbf9ef475bb39dbd3f74"}
Mar 07 21:23:01.765800 master-0 kubenswrapper[16352]: I0307 21:23:01.763728 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 07 21:23:01.766488 master-0 kubenswrapper[16352]: I0307 21:23:01.766209 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:01.786509 master-0 kubenswrapper[16352]: I0307 21:23:01.786444 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 07 21:23:01.925415 master-0 kubenswrapper[16352]: I0307 21:23:01.925309 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-var-lock\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:01.925718 master-0 kubenswrapper[16352]: I0307 21:23:01.925480 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:01.925718 master-0 kubenswrapper[16352]: I0307 21:23:01.925595 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.026865 master-0 kubenswrapper[16352]: I0307 21:23:02.026389 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-var-lock\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.026865 master-0 kubenswrapper[16352]: I0307 21:23:02.026477 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.026865 master-0 kubenswrapper[16352]: I0307 21:23:02.026644 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-var-lock\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.026865 master-0 kubenswrapper[16352]: I0307 21:23:02.026815 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.027523 master-0 kubenswrapper[16352]: I0307 21:23:02.026917 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kubelet-dir\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.046039 master-0 kubenswrapper[16352]: I0307 21:23:02.045957 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kube-api-access\") pod \"installer-4-master-0\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.158647 master-0 kubenswrapper[16352]: I0307 21:23:02.158520 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0"
Mar 07 21:23:02.247361 master-0 kubenswrapper[16352]: I0307 21:23:02.247281 16352 scope.go:117] "RemoveContainer" containerID="554ffc5919fe7a46fc0ad2b26594bc2dec62e5f792ce74d74fe8d549af25bf01"
Mar 07 21:23:02.335169 master-0 kubenswrapper[16352]: I0307 21:23:02.335073 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:23:02.382009 master-0 kubenswrapper[16352]: I0307 21:23:02.381914 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"]
Mar 07 21:23:02.382559 master-0 kubenswrapper[16352]: E0307 21:23:02.382523 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" containerName="route-controller-manager"
Mar 07 21:23:02.382559 master-0 kubenswrapper[16352]: I0307 21:23:02.382552 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" containerName="route-controller-manager"
Mar 07 21:23:02.382783 master-0 kubenswrapper[16352]: I0307 21:23:02.382757 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" containerName="route-controller-manager"
Mar 07 21:23:02.383308 master-0 kubenswrapper[16352]: I0307 21:23:02.383279 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.417369 master-0 kubenswrapper[16352]: I0307 21:23:02.410706 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"]
Mar 07 21:23:02.534000 master-0 kubenswrapper[16352]: I0307 21:23:02.533950 16352 kubelet.go:1505] "Image garbage collection succeeded"
Mar 07 21:23:02.543908 master-0 kubenswrapper[16352]: I0307 21:23:02.542507 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vvzbm\" (UniqueName: \"kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm\") pod \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") "
Mar 07 21:23:02.546315 master-0 kubenswrapper[16352]: I0307 21:23:02.544045 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") pod \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") "
Mar 07 21:23:02.551580 master-0 kubenswrapper[16352]: I0307 21:23:02.551520 16352 generic.go:334] "Generic (PLEG): container finished" podID="6deed9a9-6702-4177-a35d-58ad9930a893" containerID="ad261fabb7ddabed91944dcee1de6f4489253aa6b0b8c94f1078f8b07e107a86" exitCode=0
Mar 07 21:23:02.551666 master-0 kubenswrapper[16352]: I0307 21:23:02.551617 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" event={"ID":"6deed9a9-6702-4177-a35d-58ad9930a893","Type":"ContainerDied","Data":"ad261fabb7ddabed91944dcee1de6f4489253aa6b0b8c94f1078f8b07e107a86"}
Mar 07 21:23:02.552500 master-0 kubenswrapper[16352]: I0307 21:23:02.552447 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm" (OuterVolumeSpecName: "kube-api-access-vvzbm") pod "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" (UID: "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907"). InnerVolumeSpecName "kube-api-access-vvzbm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.555983 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.555978 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h" event={"ID":"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907","Type":"ContainerDied","Data":"59fb206093956750cd2b0971ba9daf6182e197e8af3331245cd46cb229bb1de1"}
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.556163 16352 scope.go:117] "RemoveContainer" containerID="e28998f60449f58259c0cdb625118f7b6c9387b10abccbf9ef475bb39dbd3f74"
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558002 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") pod \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") "
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558035 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") pod \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\" (UID: \"7bac1b9e-53bc-46e9-ba12-2eb0f2d09907\") "
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558315 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-plqsz\" (UniqueName: \"kubernetes.io/projected/f7126852-ef72-4d71-bb9b-b22cb6935adf-kube-api-access-plqsz\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558425 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7126852-ef72-4d71-bb9b-b22cb6935adf-serving-cert\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558447 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7126852-ef72-4d71-bb9b-b22cb6935adf-client-ca\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558522 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7126852-ef72-4d71-bb9b-b22cb6935adf-config\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558635 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vvzbm\" (UniqueName: \"kubernetes.io/projected/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-kube-api-access-vvzbm\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558656 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca" (OuterVolumeSpecName: "client-ca") pod "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" (UID: "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:23:02.558800 master-0 kubenswrapper[16352]: I0307 21:23:02.558800 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config" (OuterVolumeSpecName: "config") pod "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" (UID: "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:23:02.574346 master-0 kubenswrapper[16352]: I0307 21:23:02.574279 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" (UID: "7bac1b9e-53bc-46e9-ba12-2eb0f2d09907"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:23:02.615634 master-0 kubenswrapper[16352]: I0307 21:23:02.615468 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660313 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca\") pod \"6deed9a9-6702-4177-a35d-58ad9930a893\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") "
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660462 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config\") pod \"6deed9a9-6702-4177-a35d-58ad9930a893\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") "
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660548 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") pod \"6deed9a9-6702-4177-a35d-58ad9930a893\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") "
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660602 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzr66\" (UniqueName: \"kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66\") pod \"6deed9a9-6702-4177-a35d-58ad9930a893\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") "
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660635 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert\") pod \"6deed9a9-6702-4177-a35d-58ad9930a893\" (UID: \"6deed9a9-6702-4177-a35d-58ad9930a893\") "
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660845 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-plqsz\" (UniqueName: \"kubernetes.io/projected/f7126852-ef72-4d71-bb9b-b22cb6935adf-kube-api-access-plqsz\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660923 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7126852-ef72-4d71-bb9b-b22cb6935adf-serving-cert\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.660950 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7126852-ef72-4d71-bb9b-b22cb6935adf-client-ca\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.661402 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6deed9a9-6702-4177-a35d-58ad9930a893" (UID: "6deed9a9-6702-4177-a35d-58ad9930a893"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.661949 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca" (OuterVolumeSpecName: "client-ca") pod "6deed9a9-6702-4177-a35d-58ad9930a893" (UID: "6deed9a9-6702-4177-a35d-58ad9930a893"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.662139 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config" (OuterVolumeSpecName: "config") pod "6deed9a9-6702-4177-a35d-58ad9930a893" (UID: "6deed9a9-6702-4177-a35d-58ad9930a893"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.662915 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7126852-ef72-4d71-bb9b-b22cb6935adf-config\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.663263 16352 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.663296 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.663308 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.663330 16352 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-proxy-ca-bundles\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.663347 16352 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.663361 16352 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6deed9a9-6702-4177-a35d-58ad9930a893-client-ca\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.663636 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f7126852-ef72-4d71-bb9b-b22cb6935adf-client-ca\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.666008 master-0 kubenswrapper[16352]: I0307 21:23:02.664416 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f7126852-ef72-4d71-bb9b-b22cb6935adf-config\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.666853 master-0 kubenswrapper[16352]: I0307 21:23:02.666165 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6deed9a9-6702-4177-a35d-58ad9930a893" (UID: "6deed9a9-6702-4177-a35d-58ad9930a893"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:23:02.666853 master-0 kubenswrapper[16352]: I0307 21:23:02.666243 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66" (OuterVolumeSpecName: "kube-api-access-lzr66") pod "6deed9a9-6702-4177-a35d-58ad9930a893" (UID: "6deed9a9-6702-4177-a35d-58ad9930a893"). InnerVolumeSpecName "kube-api-access-lzr66". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:23:02.679757 master-0 kubenswrapper[16352]: I0307 21:23:02.676603 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f7126852-ef72-4d71-bb9b-b22cb6935adf-serving-cert\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.681671 master-0 kubenswrapper[16352]: I0307 21:23:02.681637 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-plqsz\" (UniqueName: \"kubernetes.io/projected/f7126852-ef72-4d71-bb9b-b22cb6935adf-kube-api-access-plqsz\") pod \"route-controller-manager-6d8686f75f-9t2lk\" (UID: \"f7126852-ef72-4d71-bb9b-b22cb6935adf\") " pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.713010 master-0 kubenswrapper[16352]: I0307 21:23:02.712966 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:02.772198 master-0 kubenswrapper[16352]: I0307 21:23:02.772139 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzr66\" (UniqueName: \"kubernetes.io/projected/6deed9a9-6702-4177-a35d-58ad9930a893-kube-api-access-lzr66\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.772301 master-0 kubenswrapper[16352]: I0307 21:23:02.772241 16352 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6deed9a9-6702-4177-a35d-58ad9930a893-serving-cert\") on node \"master-0\" DevicePath \"\""
Mar 07 21:23:02.910246 master-0 kubenswrapper[16352]: I0307 21:23:02.910177 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"]
Mar 07 21:23:02.913500 master-0 kubenswrapper[16352]: I0307 21:23:02.913438 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-cdf659ffc-4969h"]
Mar 07 21:23:02.955648 master-0 kubenswrapper[16352]: I0307 21:23:02.952501 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd/installer-2-master-0"]
Mar 07 21:23:03.064084 master-0 kubenswrapper[16352]: I0307 21:23:03.064023 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-4-master-0"]
Mar 07 21:23:03.068243 master-0 kubenswrapper[16352]: I0307 21:23:03.068206 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-master-0"]
Mar 07 21:23:03.069386 master-0 kubenswrapper[16352]: I0307 21:23:03.069358 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-64d844fb5f-9b28j"]
Mar 07 21:23:03.071642 master-0 kubenswrapper[16352]: I0307 21:23:03.071554 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-master-0"]
Mar 07 21:23:03.076142 master-0 kubenswrapper[16352]: W0307 21:23:03.076097 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod72b4d517_f9c1_4fb2_9217_bd02b6838b07.slice/crio-f45abd2e0704cacec7b591fdf1a81fae9a35aa9a429a48956b2209d5b72e79df WatchSource:0}: Error finding container f45abd2e0704cacec7b591fdf1a81fae9a35aa9a429a48956b2209d5b72e79df: Status 404 returned error can't find the container with id f45abd2e0704cacec7b591fdf1a81fae9a35aa9a429a48956b2209d5b72e79df
Mar 07 21:23:03.207707 master-0 kubenswrapper[16352]: I0307 21:23:03.204235 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bac1b9e-53bc-46e9-ba12-2eb0f2d09907" path="/var/lib/kubelet/pods/7bac1b9e-53bc-46e9-ba12-2eb0f2d09907/volumes"
Mar 07 21:23:03.275369 master-0 kubenswrapper[16352]: I0307 21:23:03.275284 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"]
Mar 07 21:23:03.571636 master-0 kubenswrapper[16352]: I0307 21:23:03.571591 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"
Mar 07 21:23:03.572233 master-0 kubenswrapper[16352]: I0307 21:23:03.572203 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86d86fcf49-hgbkg" event={"ID":"6deed9a9-6702-4177-a35d-58ad9930a893","Type":"ContainerDied","Data":"9a3242defcab78a5704c3ac516165c6355f42a0842d58543e6938dbfa54c0dc4"}
Mar 07 21:23:03.572309 master-0 kubenswrapper[16352]: I0307 21:23:03.572250 16352 scope.go:117] "RemoveContainer" containerID="ad261fabb7ddabed91944dcee1de6f4489253aa6b0b8c94f1078f8b07e107a86"
Mar 07 21:23:03.577790 master-0 kubenswrapper[16352]: I0307 21:23:03.577752 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d","Type":"ContainerStarted","Data":"050e30128b0cdbce0f0b53b881d9261c6be2ceecd72c21722431553b7c58b6a4"}
Mar 07 21:23:03.577901 master-0 kubenswrapper[16352]: I0307 21:23:03.577886 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d","Type":"ContainerStarted","Data":"54cddc214c657eb2c7e08e1f1a8b46f428acef319964a3a20155b957d275dec9"}
Mar 07 21:23:03.584294 master-0 kubenswrapper[16352]: I0307 21:23:03.584150 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"72b4d517-f9c1-4fb2-9217-bd02b6838b07","Type":"ContainerStarted","Data":"22abd09e7dbeec5e7d0ea14165d6b45aaab0cf611196679414518d4f52358f7f"}
Mar 07 21:23:03.584294 master-0 kubenswrapper[16352]: I0307 21:23:03.584250 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"72b4d517-f9c1-4fb2-9217-bd02b6838b07","Type":"ContainerStarted","Data":"f45abd2e0704cacec7b591fdf1a81fae9a35aa9a429a48956b2209d5b72e79df"}
Mar 07 21:23:03.585640 master-0 kubenswrapper[16352]: I0307 21:23:03.585604 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d844fb5f-9b28j" event={"ID":"253bb615-1b60-4112-aee8-f572d1c84114","Type":"ContainerStarted","Data":"b6d6be69bca0675d073552dfe02cec1c5e47fac746e07e8d15c549a48ffeea21"}
Mar 07 21:23:03.591479 master-0 kubenswrapper[16352]: I0307 21:23:03.591436 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerStarted","Data":"ccdc3057f0c5390145addb4536e286a11c76db79587c90eacf6c9d0ef0c38b2e"}
Mar 07 21:23:03.591566 master-0 kubenswrapper[16352]: I0307 21:23:03.591491 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerStarted","Data":"bf67172d49ecbb34d6bd13a173d3bb7089a0a5c3c57b4c66dd6aa4c8fd2f11fc"}
Mar 07 21:23:03.591566 master-0 kubenswrapper[16352]: I0307 21:23:03.591508 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerStarted","Data":"a8467b33ca9f402e1c71e8d3f6e98a801a76f328df657df0db479d82b62f50e3"}
Mar 07 21:23:03.591566 master-0 kubenswrapper[16352]: I0307 21:23:03.591522 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerStarted","Data":"80d8dc37b0ac916e93ffc8cf045352f9940999409a2ec6b57f769d6ce37829e8"}
Mar 07 21:23:03.600936 master-0 kubenswrapper[16352]: I0307 21:23:03.600834 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"96e31400-86e3-46d2-97ee-12fd3e17893a","Type":"ContainerStarted","Data":"2045efb936aa65816e8bbac215c1cd80d641c4f9280cb254d3b2b050096e0d95"}
Mar 07 21:23:03.603613 master-0 kubenswrapper[16352]: I0307 21:23:03.603379 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"]
Mar 07 21:23:03.605953 master-0 kubenswrapper[16352]: I0307 21:23:03.605905 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk" event={"ID":"f7126852-ef72-4d71-bb9b-b22cb6935adf","Type":"ContainerStarted","Data":"6d3c8b165236ad5860ba0656df9c288632457f4f6270f244783e71c890523b0d"}
Mar 07 21:23:03.606050 master-0 kubenswrapper[16352]: I0307 21:23:03.605955 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk" event={"ID":"f7126852-ef72-4d71-bb9b-b22cb6935adf","Type":"ContainerStarted","Data":"d61b589070e5c7beedcd47860d2753b540b8198f5926cc9aee518667f2e8375d"}
Mar 07 21:23:03.606199 master-0 kubenswrapper[16352]: I0307 21:23:03.606164 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:03.606796 master-0 kubenswrapper[16352]: I0307 21:23:03.606720 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86d86fcf49-hgbkg"]
Mar 07 21:23:03.616073 master-0 kubenswrapper[16352]: I0307 21:23:03.615673 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"23d4915d-4b88-4875-b794-414b5b7a1d7b","Type":"ContainerStarted","Data":"96332d6354d7124cab0f34581c5ba824a7340e0b1a13f33f2c5fec692c2a0562"}
Mar 07 21:23:03.616073 master-0 kubenswrapper[16352]: I0307 21:23:03.615741 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"23d4915d-4b88-4875-b794-414b5b7a1d7b","Type":"ContainerStarted","Data":"5566788b071694c42c6559fe212f6c8811fc2b86aa74412235ba1fd67b86ee63"}
Mar 07 21:23:03.623495 master-0 kubenswrapper[16352]: I0307 21:23:03.623439 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-84f57b9877-dwqg9" event={"ID":"59a192e8-491e-405e-955e-c293b335634d","Type":"ContainerStarted","Data":"1789a91318661ae84efd9e0cc1bdb4c92c4927d945a73ad8381efa70d3324c4b"}
Mar 07 21:23:03.625134 master-0 kubenswrapper[16352]: I0307 21:23:03.625012 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-master-0" podStartSLOduration=8.624993438 podStartE2EDuration="8.624993438s" podCreationTimestamp="2026-03-07 21:22:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:23:03.618640805 +0000 UTC m=+306.689345864" watchObservedRunningTime="2026-03-07 21:23:03.624993438 +0000 UTC m=+306.695698487"
Mar 07 21:23:03.625302 master-0 kubenswrapper[16352]: I0307 21:23:03.625253 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-84f57b9877-dwqg9"
Mar 07 21:23:03.627380 master-0 kubenswrapper[16352]: I0307 21:23:03.627329 16352 patch_prober.go:28] interesting pod/downloads-84f57b9877-dwqg9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body=
Mar 07 21:23:03.627446 master-0 kubenswrapper[16352]: I0307 21:23:03.627411 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-dwqg9" podUID="59a192e8-491e-405e-955e-c293b335634d" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused"
Mar 07 21:23:03.648135 master-0 kubenswrapper[16352]: I0307 21:23:03.648043 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-4-master-0" podStartSLOduration=5.648020804 podStartE2EDuration="5.648020804s" podCreationTimestamp="2026-03-07 21:22:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:23:03.644109629 +0000 UTC m=+306.714814708" watchObservedRunningTime="2026-03-07 21:23:03.648020804 +0000 UTC m=+306.718725853"
Mar 07 21:23:03.672051 master-0 kubenswrapper[16352]: I0307 21:23:03.671977 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-master-0" podStartSLOduration=2.671957021 podStartE2EDuration="2.671957021s" podCreationTimestamp="2026-03-07 21:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:23:03.66571639 +0000 UTC m=+306.736421449" watchObservedRunningTime="2026-03-07 21:23:03.671957021 +0000 UTC m=+306.742662080"
Mar 07 21:23:03.694972 master-0 kubenswrapper[16352]: I0307 21:23:03.694877 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-84f57b9877-dwqg9" podStartSLOduration=3.981853733 podStartE2EDuration="42.694845273s" podCreationTimestamp="2026-03-07 21:22:21 +0000 UTC" firstStartedPulling="2026-03-07 21:22:23.830327379 +0000 UTC m=+266.901032458" lastFinishedPulling="2026-03-07 21:23:02.543318939 +0000 UTC m=+305.614023998" observedRunningTime="2026-03-07 21:23:03.691666906 +0000 UTC m=+306.762371965" watchObservedRunningTime="2026-03-07 21:23:03.694845273 +0000 UTC m=+306.765550332"
Mar 07 21:23:03.719543 master-0 kubenswrapper[16352]: I0307 21:23:03.719391 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/installer-2-master-0" podStartSLOduration=13.719366435 podStartE2EDuration="13.719366435s" podCreationTimestamp="2026-03-07 21:22:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:23:03.713017111 +0000 UTC m=+306.783722240" watchObservedRunningTime="2026-03-07 21:23:03.719366435 +0000 UTC m=+306.790071504"
Mar 07 21:23:03.747907 master-0 kubenswrapper[16352]: I0307 21:23:03.747823 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk" podStartSLOduration=15.74779927 podStartE2EDuration="15.74779927s" podCreationTimestamp="2026-03-07 21:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:23:03.744915701 +0000 UTC m=+306.815620770" watchObservedRunningTime="2026-03-07 21:23:03.74779927 +0000 UTC m=+306.818504339"
Mar 07 21:23:04.035470 master-0 kubenswrapper[16352]: I0307 21:23:04.035412 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d8686f75f-9t2lk"
Mar 07 21:23:04.640505 master-0 kubenswrapper[16352]: I0307 21:23:04.640350 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerStarted","Data":"5e2780e2e7c98c2c748c388c7ed2efb0b206c5e5cf8c84e599be58fe861624bd"}
Mar 07 21:23:04.640505 master-0 kubenswrapper[16352]: I0307 21:23:04.640415 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerStarted","Data":"2838178b4f1ff502863f7dcc3d2546a40d8338e52a2be0345ddd81cb81b6cfa5"}
Mar 07 21:23:04.644513
master-0 kubenswrapper[16352]: I0307 21:23:04.644461 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"96e31400-86e3-46d2-97ee-12fd3e17893a","Type":"ContainerStarted","Data":"d85efd8cc28e3eb5fccfdb120c70d3e50513bd9dfd8370494bcc501d11ca0703"} Mar 07 21:23:04.645315 master-0 kubenswrapper[16352]: I0307 21:23:04.645269 16352 patch_prober.go:28] interesting pod/downloads-84f57b9877-dwqg9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body= Mar 07 21:23:04.645482 master-0 kubenswrapper[16352]: I0307 21:23:04.645436 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-dwqg9" podUID="59a192e8-491e-405e-955e-c293b335634d" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" Mar 07 21:23:04.700946 master-0 kubenswrapper[16352]: I0307 21:23:04.700816 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=254.640940752 podStartE2EDuration="4m32.700783575s" podCreationTimestamp="2026-03-07 21:18:32 +0000 UTC" firstStartedPulling="2026-03-07 21:22:44.307982903 +0000 UTC m=+287.378688002" lastFinishedPulling="2026-03-07 21:23:02.367825766 +0000 UTC m=+305.438530825" observedRunningTime="2026-03-07 21:23:04.694792981 +0000 UTC m=+307.765498050" watchObservedRunningTime="2026-03-07 21:23:04.700783575 +0000 UTC m=+307.771488634" Mar 07 21:23:05.018759 master-0 kubenswrapper[16352]: I0307 21:23:05.018563 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-68f988879c-j2dj6"] Mar 07 21:23:05.019069 master-0 kubenswrapper[16352]: E0307 21:23:05.019039 16352 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="6deed9a9-6702-4177-a35d-58ad9930a893" containerName="controller-manager" Mar 07 21:23:05.019069 master-0 kubenswrapper[16352]: I0307 21:23:05.019061 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="6deed9a9-6702-4177-a35d-58ad9930a893" containerName="controller-manager" Mar 07 21:23:05.019258 master-0 kubenswrapper[16352]: I0307 21:23:05.019227 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="6deed9a9-6702-4177-a35d-58ad9930a893" containerName="controller-manager" Mar 07 21:23:05.019868 master-0 kubenswrapper[16352]: I0307 21:23:05.019820 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.023632 master-0 kubenswrapper[16352]: I0307 21:23:05.023130 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 07 21:23:05.039786 master-0 kubenswrapper[16352]: I0307 21:23:05.027587 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68f988879c-j2dj6"] Mar 07 21:23:05.039786 master-0 kubenswrapper[16352]: I0307 21:23:05.027704 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l888p" Mar 07 21:23:05.039786 master-0 kubenswrapper[16352]: I0307 21:23:05.028096 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 07 21:23:05.039786 master-0 kubenswrapper[16352]: I0307 21:23:05.029127 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 21:23:05.039786 master-0 kubenswrapper[16352]: I0307 21:23:05.029808 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 
21:23:05.039786 master-0 kubenswrapper[16352]: I0307 21:23:05.030194 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 21:23:05.044709 master-0 kubenswrapper[16352]: I0307 21:23:05.042814 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 07 21:23:05.129815 master-0 kubenswrapper[16352]: I0307 21:23:05.129737 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-client-ca\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.130142 master-0 kubenswrapper[16352]: I0307 21:23:05.130079 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g65qk\" (UniqueName: \"kubernetes.io/projected/2957024f-9646-499f-913c-90b81f01eecd-kube-api-access-g65qk\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.130398 master-0 kubenswrapper[16352]: I0307 21:23:05.130328 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-config\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.130531 master-0 kubenswrapper[16352]: I0307 21:23:05.130502 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/2957024f-9646-499f-913c-90b81f01eecd-serving-cert\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.130604 master-0 kubenswrapper[16352]: I0307 21:23:05.130578 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-proxy-ca-bundles\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.205097 master-0 kubenswrapper[16352]: I0307 21:23:05.204999 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6deed9a9-6702-4177-a35d-58ad9930a893" path="/var/lib/kubelet/pods/6deed9a9-6702-4177-a35d-58ad9930a893/volumes" Mar 07 21:23:05.232612 master-0 kubenswrapper[16352]: I0307 21:23:05.232538 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-config\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.233099 master-0 kubenswrapper[16352]: I0307 21:23:05.233001 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2957024f-9646-499f-913c-90b81f01eecd-serving-cert\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.233190 master-0 kubenswrapper[16352]: I0307 21:23:05.233161 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-proxy-ca-bundles\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.233635 master-0 kubenswrapper[16352]: I0307 21:23:05.233559 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-client-ca\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.233915 master-0 kubenswrapper[16352]: I0307 21:23:05.233878 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g65qk\" (UniqueName: \"kubernetes.io/projected/2957024f-9646-499f-913c-90b81f01eecd-kube-api-access-g65qk\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.235108 master-0 kubenswrapper[16352]: I0307 21:23:05.235074 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-client-ca\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.235570 master-0 kubenswrapper[16352]: I0307 21:23:05.235491 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-proxy-ca-bundles\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 
21:23:05.235976 master-0 kubenswrapper[16352]: I0307 21:23:05.235925 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2957024f-9646-499f-913c-90b81f01eecd-config\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.239155 master-0 kubenswrapper[16352]: I0307 21:23:05.239122 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2957024f-9646-499f-913c-90b81f01eecd-serving-cert\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.270071 master-0 kubenswrapper[16352]: I0307 21:23:05.269912 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g65qk\" (UniqueName: \"kubernetes.io/projected/2957024f-9646-499f-913c-90b81f01eecd-kube-api-access-g65qk\") pod \"controller-manager-68f988879c-j2dj6\" (UID: \"2957024f-9646-499f-913c-90b81f01eecd\") " pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.368191 master-0 kubenswrapper[16352]: I0307 21:23:05.368115 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:05.655029 master-0 kubenswrapper[16352]: I0307 21:23:05.654938 16352 patch_prober.go:28] interesting pod/downloads-84f57b9877-dwqg9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" start-of-body= Mar 07 21:23:05.655785 master-0 kubenswrapper[16352]: I0307 21:23:05.655017 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-84f57b9877-dwqg9" podUID="59a192e8-491e-405e-955e-c293b335634d" containerName="download-server" probeResult="failure" output="Get \"http://10.128.0.89:8080/\": dial tcp 10.128.0.89:8080: connect: connection refused" Mar 07 21:23:06.248954 master-0 kubenswrapper[16352]: I0307 21:23:06.248858 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-68f988879c-j2dj6"] Mar 07 21:23:07.856702 master-0 kubenswrapper[16352]: I0307 21:23:07.856613 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:23:08.335395 master-0 kubenswrapper[16352]: W0307 21:23:08.335196 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2957024f_9646_499f_913c_90b81f01eecd.slice/crio-9dce8f10bc9739378c630ef4c955d6aab86e276d4e7518caa8c81aef66baf38a WatchSource:0}: Error finding container 9dce8f10bc9739378c630ef4c955d6aab86e276d4e7518caa8c81aef66baf38a: Status 404 returned error can't find the container with id 9dce8f10bc9739378c630ef4c955d6aab86e276d4e7518caa8c81aef66baf38a Mar 07 21:23:08.696936 master-0 kubenswrapper[16352]: I0307 21:23:08.696833 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" 
event={"ID":"2957024f-9646-499f-913c-90b81f01eecd","Type":"ContainerStarted","Data":"9dce8f10bc9739378c630ef4c955d6aab86e276d4e7518caa8c81aef66baf38a"} Mar 07 21:23:09.707414 master-0 kubenswrapper[16352]: I0307 21:23:09.707330 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d844fb5f-9b28j" event={"ID":"253bb615-1b60-4112-aee8-f572d1c84114","Type":"ContainerStarted","Data":"79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371"} Mar 07 21:23:09.709609 master-0 kubenswrapper[16352]: I0307 21:23:09.709538 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" event={"ID":"2957024f-9646-499f-913c-90b81f01eecd","Type":"ContainerStarted","Data":"d1098107652edaefa736d099e7020ce79c96c3a73438a03277842e63addc39cd"} Mar 07 21:23:09.709967 master-0 kubenswrapper[16352]: I0307 21:23:09.709898 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:09.716591 master-0 kubenswrapper[16352]: I0307 21:23:09.716498 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" Mar 07 21:23:09.739877 master-0 kubenswrapper[16352]: I0307 21:23:09.739657 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-64d844fb5f-9b28j" podStartSLOduration=16.078060575 podStartE2EDuration="21.739614949s" podCreationTimestamp="2026-03-07 21:22:48 +0000 UTC" firstStartedPulling="2026-03-07 21:23:03.105697993 +0000 UTC m=+306.176403052" lastFinishedPulling="2026-03-07 21:23:08.767252327 +0000 UTC m=+311.837957426" observedRunningTime="2026-03-07 21:23:09.737037967 +0000 UTC m=+312.807743046" watchObservedRunningTime="2026-03-07 21:23:09.739614949 +0000 UTC m=+312.810320048" Mar 07 21:23:09.771584 master-0 kubenswrapper[16352]: I0307 21:23:09.770472 
16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" podStartSLOduration=21.770442062 podStartE2EDuration="21.770442062s" podCreationTimestamp="2026-03-07 21:22:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:23:09.761138609 +0000 UTC m=+312.831843678" watchObservedRunningTime="2026-03-07 21:23:09.770442062 +0000 UTC m=+312.841147141" Mar 07 21:23:10.918009 master-0 kubenswrapper[16352]: I0307 21:23:10.917924 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:23:13.336941 master-0 kubenswrapper[16352]: I0307 21:23:13.336831 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" podUID="e7e05fee-e85a-41b9-b3cb-64658e71bf1a" containerName="oauth-openshift" containerID="cri-o://180b2e86f8e1962ba6e7528073f3f3f6b9cd411fcee0a25b278b1a7f919e78b5" gracePeriod=15 Mar 07 21:23:13.346054 master-0 kubenswrapper[16352]: I0307 21:23:13.345919 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-84f57b9877-dwqg9" Mar 07 21:23:13.752661 master-0 kubenswrapper[16352]: I0307 21:23:13.752549 16352 generic.go:334] "Generic (PLEG): container finished" podID="e7e05fee-e85a-41b9-b3cb-64658e71bf1a" containerID="180b2e86f8e1962ba6e7528073f3f3f6b9cd411fcee0a25b278b1a7f919e78b5" exitCode=0 Mar 07 21:23:13.752661 master-0 kubenswrapper[16352]: I0307 21:23:13.752632 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" event={"ID":"e7e05fee-e85a-41b9-b3cb-64658e71bf1a","Type":"ContainerDied","Data":"180b2e86f8e1962ba6e7528073f3f3f6b9cd411fcee0a25b278b1a7f919e78b5"} Mar 07 21:23:14.503860 master-0 
kubenswrapper[16352]: I0307 21:23:14.503816 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:23:14.560544 master-0 kubenswrapper[16352]: I0307 21:23:14.560460 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-session\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.560544 master-0 kubenswrapper[16352]: I0307 21:23:14.560528 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-ocp-branding-template\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.560544 master-0 kubenswrapper[16352]: I0307 21:23:14.560565 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-trusted-ca-bundle\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.560989 master-0 kubenswrapper[16352]: I0307 21:23:14.560591 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-error\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.560989 master-0 kubenswrapper[16352]: I0307 21:23:14.560659 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-router-certs\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.560989 master-0 kubenswrapper[16352]: I0307 21:23:14.560686 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-service-ca\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.560989 master-0 kubenswrapper[16352]: I0307 21:23:14.560749 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4lqg9\" (UniqueName: \"kubernetes.io/projected/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-kube-api-access-4lqg9\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.561184 master-0 kubenswrapper[16352]: I0307 21:23:14.561097 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-serving-cert\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.561451 master-0 kubenswrapper[16352]: I0307 21:23:14.561396 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-provider-selection\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.561564 master-0 kubenswrapper[16352]: I0307 21:23:14.561522 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-dir\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.561615 master-0 kubenswrapper[16352]: I0307 21:23:14.561562 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:23:14.561615 master-0 kubenswrapper[16352]: I0307 21:23:14.561591 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-login\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.561683 master-0 kubenswrapper[16352]: I0307 21:23:14.561661 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-policies\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.561774 master-0 kubenswrapper[16352]: I0307 21:23:14.561712 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:23:14.561774 master-0 kubenswrapper[16352]: I0307 21:23:14.561757 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-cliconfig\") pod \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\" (UID: \"e7e05fee-e85a-41b9-b3cb-64658e71bf1a\") " Mar 07 21:23:14.561868 master-0 kubenswrapper[16352]: I0307 21:23:14.561779 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:23:14.562237 master-0 kubenswrapper[16352]: I0307 21:23:14.562200 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:23:14.562584 master-0 kubenswrapper[16352]: I0307 21:23:14.562517 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:23:14.562932 master-0 kubenswrapper[16352]: I0307 21:23:14.562900 16352 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.562991 master-0 kubenswrapper[16352]: I0307 21:23:14.562929 16352 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.562991 master-0 kubenswrapper[16352]: I0307 21:23:14.562949 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.562991 master-0 kubenswrapper[16352]: I0307 21:23:14.562965 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.562991 master-0 kubenswrapper[16352]: I0307 21:23:14.562977 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.564705 master-0 kubenswrapper[16352]: I0307 21:23:14.564648 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). 
InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:23:14.565142 master-0 kubenswrapper[16352]: I0307 21:23:14.565099 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:23:14.565428 master-0 kubenswrapper[16352]: I0307 21:23:14.565375 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:23:14.565505 master-0 kubenswrapper[16352]: I0307 21:23:14.565425 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:23:14.566077 master-0 kubenswrapper[16352]: I0307 21:23:14.566013 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:23:14.566616 master-0 kubenswrapper[16352]: I0307 21:23:14.566570 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-kube-api-access-4lqg9" (OuterVolumeSpecName: "kube-api-access-4lqg9") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "kube-api-access-4lqg9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:23:14.566616 master-0 kubenswrapper[16352]: I0307 21:23:14.566577 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:23:14.569134 master-0 kubenswrapper[16352]: I0307 21:23:14.568985 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "e7e05fee-e85a-41b9-b3cb-64658e71bf1a" (UID: "e7e05fee-e85a-41b9-b3cb-64658e71bf1a"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:23:14.664524 master-0 kubenswrapper[16352]: I0307 21:23:14.664444 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.664524 master-0 kubenswrapper[16352]: I0307 21:23:14.664509 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-login\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.664524 master-0 kubenswrapper[16352]: I0307 21:23:14.664527 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.664855 master-0 kubenswrapper[16352]: I0307 21:23:14.664541 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.664855 master-0 kubenswrapper[16352]: I0307 21:23:14.664558 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.664855 master-0 kubenswrapper[16352]: I0307 21:23:14.664571 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.664855 master-0 kubenswrapper[16352]: I0307 21:23:14.664586 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4lqg9\" (UniqueName: \"kubernetes.io/projected/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-kube-api-access-4lqg9\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.664855 master-0 kubenswrapper[16352]: I0307 21:23:14.664600 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e05fee-e85a-41b9-b3cb-64658e71bf1a-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:14.763122 master-0 kubenswrapper[16352]: I0307 21:23:14.762936 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" event={"ID":"e7e05fee-e85a-41b9-b3cb-64658e71bf1a","Type":"ContainerDied","Data":"abea292beb2f62f6cb8c121335eac995a6de858aeef5197b723a613afbc0ad5e"} Mar 07 21:23:14.763122 master-0 kubenswrapper[16352]: I0307 21:23:14.763022 16352 scope.go:117] "RemoveContainer" containerID="180b2e86f8e1962ba6e7528073f3f3f6b9cd411fcee0a25b278b1a7f919e78b5" Mar 07 21:23:14.763122 master-0 kubenswrapper[16352]: I0307 21:23:14.763058 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-67c6dd6955-hbksv" Mar 07 21:23:15.347220 master-0 kubenswrapper[16352]: I0307 21:23:15.345147 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-698d9d45c9-5wh7z"] Mar 07 21:23:15.347220 master-0 kubenswrapper[16352]: E0307 21:23:15.345634 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e7e05fee-e85a-41b9-b3cb-64658e71bf1a" containerName="oauth-openshift" Mar 07 21:23:15.347220 master-0 kubenswrapper[16352]: I0307 21:23:15.345650 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7e05fee-e85a-41b9-b3cb-64658e71bf1a" containerName="oauth-openshift" Mar 07 21:23:15.347220 master-0 kubenswrapper[16352]: I0307 21:23:15.345903 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="e7e05fee-e85a-41b9-b3cb-64658e71bf1a" containerName="oauth-openshift" Mar 07 21:23:15.347220 master-0 kubenswrapper[16352]: I0307 21:23:15.346455 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353324 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353399 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353618 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353685 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353787 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353912 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353965 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.353987 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.354017 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.354363 16352 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.354414 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 21:23:15.356446 master-0 kubenswrapper[16352]: I0307 21:23:15.354358 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-x8pfn" Mar 07 21:23:15.364771 master-0 kubenswrapper[16352]: I0307 21:23:15.364686 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 07 21:23:15.371751 master-0 kubenswrapper[16352]: I0307 21:23:15.371674 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 07 21:23:15.480948 master-0 kubenswrapper[16352]: I0307 21:23:15.480854 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-router-certs\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.481493 master-0 kubenswrapper[16352]: I0307 21:23:15.481445 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.481813 master-0 kubenswrapper[16352]: I0307 21:23:15.481768 16352 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.482099 master-0 kubenswrapper[16352]: I0307 21:23:15.482054 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.482364 master-0 kubenswrapper[16352]: I0307 21:23:15.482319 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-audit-policies\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.482616 master-0 kubenswrapper[16352]: I0307 21:23:15.482577 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-error\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.482981 master-0 kubenswrapper[16352]: I0307 21:23:15.482934 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86a3c1de-a810-4b48-be89-1b05da316a28-audit-dir\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.483219 master-0 kubenswrapper[16352]: I0307 21:23:15.483185 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-login\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.483479 master-0 kubenswrapper[16352]: I0307 21:23:15.483448 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-session\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.483907 master-0 kubenswrapper[16352]: I0307 21:23:15.483878 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.484134 master-0 kubenswrapper[16352]: I0307 21:23:15.484104 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-service-ca\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.484314 master-0 kubenswrapper[16352]: I0307 21:23:15.484283 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.484548 master-0 kubenswrapper[16352]: I0307 21:23:15.484502 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml8xn\" (UniqueName: \"kubernetes.io/projected/86a3c1de-a810-4b48-be89-1b05da316a28-kube-api-access-ml8xn\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.587189 master-0 kubenswrapper[16352]: I0307 21:23:15.587065 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-router-certs\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.587189 master-0 kubenswrapper[16352]: I0307 21:23:15.587147 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-698d9d45c9-5wh7z\" 
(UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587388 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587458 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587567 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-audit-policies\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587619 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-error\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 
kubenswrapper[16352]: I0307 21:23:15.587741 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86a3c1de-a810-4b48-be89-1b05da316a28-audit-dir\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587798 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-login\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587835 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-session\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587904 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587943 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-service-ca\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587963 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.588244 master-0 kubenswrapper[16352]: I0307 21:23:15.587990 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ml8xn\" (UniqueName: \"kubernetes.io/projected/86a3c1de-a810-4b48-be89-1b05da316a28-kube-api-access-ml8xn\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.589078 master-0 kubenswrapper[16352]: I0307 21:23:15.588347 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-cliconfig\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.589078 master-0 kubenswrapper[16352]: I0307 21:23:15.588437 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86a3c1de-a810-4b48-be89-1b05da316a28-audit-dir\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " 
pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.590633 master-0 kubenswrapper[16352]: I0307 21:23:15.590580 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.591166 master-0 kubenswrapper[16352]: I0307 21:23:15.590803 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-audit-policies\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.591166 master-0 kubenswrapper[16352]: I0307 21:23:15.590964 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-service-ca\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.593264 master-0 kubenswrapper[16352]: I0307 21:23:15.593198 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-login\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.593516 master-0 kubenswrapper[16352]: I0307 21:23:15.593364 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-router-certs\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.594614 master-0 kubenswrapper[16352]: I0307 21:23:15.594563 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-serving-cert\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.595619 master-0 kubenswrapper[16352]: I0307 21:23:15.595575 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.595874 master-0 kubenswrapper[16352]: I0307 21:23:15.595824 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-error\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.596259 master-0 kubenswrapper[16352]: I0307 21:23:15.596219 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-ocp-branding-template\") pod 
\"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.597294 master-0 kubenswrapper[16352]: I0307 21:23:15.597178 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-session\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:15.649413 master-0 kubenswrapper[16352]: I0307 21:23:15.649324 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-67c6dd6955-hbksv"] Mar 07 21:23:15.938895 master-0 kubenswrapper[16352]: I0307 21:23:15.938684 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-698d9d45c9-5wh7z"] Mar 07 21:23:16.038187 master-0 kubenswrapper[16352]: I0307 21:23:16.038073 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-67c6dd6955-hbksv"] Mar 07 21:23:16.501862 master-0 kubenswrapper[16352]: I0307 21:23:16.501750 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml8xn\" (UniqueName: \"kubernetes.io/projected/86a3c1de-a810-4b48-be89-1b05da316a28-kube-api-access-ml8xn\") pod \"oauth-openshift-698d9d45c9-5wh7z\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:16.590301 master-0 kubenswrapper[16352]: I0307 21:23:16.590177 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:17.203946 master-0 kubenswrapper[16352]: I0307 21:23:17.203834 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e05fee-e85a-41b9-b3cb-64658e71bf1a" path="/var/lib/kubelet/pods/e7e05fee-e85a-41b9-b3cb-64658e71bf1a/volumes" Mar 07 21:23:17.555022 master-0 kubenswrapper[16352]: I0307 21:23:17.554914 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-698d9d45c9-5wh7z"] Mar 07 21:23:17.806921 master-0 kubenswrapper[16352]: I0307 21:23:17.806751 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" event={"ID":"86a3c1de-a810-4b48-be89-1b05da316a28","Type":"ContainerStarted","Data":"1efe282d92b3c51f1f47e2abc1e2f6213386af1e76b2de2bb7c7c90e7f88b389"} Mar 07 21:23:18.576805 master-0 kubenswrapper[16352]: I0307 21:23:18.576645 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:23:18.576805 master-0 kubenswrapper[16352]: I0307 21:23:18.576743 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:23:18.578669 master-0 kubenswrapper[16352]: I0307 21:23:18.578577 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:23:18.578832 master-0 kubenswrapper[16352]: I0307 21:23:18.578712 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: 
connection refused" Mar 07 21:23:18.816882 master-0 kubenswrapper[16352]: I0307 21:23:18.816779 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" event={"ID":"86a3c1de-a810-4b48-be89-1b05da316a28","Type":"ContainerStarted","Data":"50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f"} Mar 07 21:23:18.819225 master-0 kubenswrapper[16352]: I0307 21:23:18.817180 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:18.822885 master-0 kubenswrapper[16352]: I0307 21:23:18.822811 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:23:19.305581 master-0 kubenswrapper[16352]: I0307 21:23:19.304654 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" podStartSLOduration=9.304614751999999 podStartE2EDuration="9.304614752s" podCreationTimestamp="2026-03-07 21:23:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:23:19.302124181 +0000 UTC m=+322.372829270" watchObservedRunningTime="2026-03-07 21:23:19.304614752 +0000 UTC m=+322.375319861" Mar 07 21:23:21.637175 master-0 kubenswrapper[16352]: I0307 21:23:21.637082 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_c34534f8-0d38-40a8-a28c-11c20ce64353/installer/0.log" Mar 07 21:23:21.637175 master-0 kubenswrapper[16352]: I0307 21:23:21.637176 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:23:21.764455 master-0 kubenswrapper[16352]: I0307 21:23:21.764307 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-var-lock\") pod \"c34534f8-0d38-40a8-a28c-11c20ce64353\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " Mar 07 21:23:21.764455 master-0 kubenswrapper[16352]: I0307 21:23:21.764378 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-kubelet-dir\") pod \"c34534f8-0d38-40a8-a28c-11c20ce64353\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " Mar 07 21:23:21.764959 master-0 kubenswrapper[16352]: I0307 21:23:21.764488 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34534f8-0d38-40a8-a28c-11c20ce64353-kube-api-access\") pod \"c34534f8-0d38-40a8-a28c-11c20ce64353\" (UID: \"c34534f8-0d38-40a8-a28c-11c20ce64353\") " Mar 07 21:23:21.764959 master-0 kubenswrapper[16352]: I0307 21:23:21.764506 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-var-lock" (OuterVolumeSpecName: "var-lock") pod "c34534f8-0d38-40a8-a28c-11c20ce64353" (UID: "c34534f8-0d38-40a8-a28c-11c20ce64353"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:23:21.764959 master-0 kubenswrapper[16352]: I0307 21:23:21.764561 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c34534f8-0d38-40a8-a28c-11c20ce64353" (UID: "c34534f8-0d38-40a8-a28c-11c20ce64353"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:23:21.765171 master-0 kubenswrapper[16352]: I0307 21:23:21.765146 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:21.765171 master-0 kubenswrapper[16352]: I0307 21:23:21.765167 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c34534f8-0d38-40a8-a28c-11c20ce64353-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:21.768537 master-0 kubenswrapper[16352]: I0307 21:23:21.768485 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c34534f8-0d38-40a8-a28c-11c20ce64353-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c34534f8-0d38-40a8-a28c-11c20ce64353" (UID: "c34534f8-0d38-40a8-a28c-11c20ce64353"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:23:21.850503 master-0 kubenswrapper[16352]: I0307 21:23:21.850235 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-3-master-0_c34534f8-0d38-40a8-a28c-11c20ce64353/installer/0.log" Mar 07 21:23:21.850503 master-0 kubenswrapper[16352]: I0307 21:23:21.850336 16352 generic.go:334] "Generic (PLEG): container finished" podID="c34534f8-0d38-40a8-a28c-11c20ce64353" containerID="63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990" exitCode=1 Mar 07 21:23:21.850503 master-0 kubenswrapper[16352]: I0307 21:23:21.850391 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"c34534f8-0d38-40a8-a28c-11c20ce64353","Type":"ContainerDied","Data":"63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990"} Mar 07 21:23:21.850503 master-0 kubenswrapper[16352]: I0307 21:23:21.850425 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-3-master-0" event={"ID":"c34534f8-0d38-40a8-a28c-11c20ce64353","Type":"ContainerDied","Data":"ad3ef05c4fcf71fd72985b9596a198bcc59854b6714fb20c4ba41f018577c4dd"} Mar 07 21:23:21.850503 master-0 kubenswrapper[16352]: I0307 21:23:21.850446 16352 scope.go:117] "RemoveContainer" containerID="63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990" Mar 07 21:23:21.851017 master-0 kubenswrapper[16352]: I0307 21:23:21.850577 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-3-master-0" Mar 07 21:23:21.867328 master-0 kubenswrapper[16352]: I0307 21:23:21.867096 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c34534f8-0d38-40a8-a28c-11c20ce64353-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:21.883569 master-0 kubenswrapper[16352]: I0307 21:23:21.883493 16352 scope.go:117] "RemoveContainer" containerID="63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990" Mar 07 21:23:21.884575 master-0 kubenswrapper[16352]: E0307 21:23:21.884486 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990\": container with ID starting with 63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990 not found: ID does not exist" containerID="63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990" Mar 07 21:23:21.884638 master-0 kubenswrapper[16352]: I0307 21:23:21.884584 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990"} err="failed to get container status \"63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990\": rpc error: code = NotFound desc = could not find container \"63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990\": container with ID starting with 63e6bcd0170aee0c6b43cd07739081d113b8e5b7ba9f7cad2ce6edca7d8bc990 not found: ID does not exist" Mar 07 21:23:21.907898 master-0 kubenswrapper[16352]: I0307 21:23:21.907789 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 07 21:23:21.913286 master-0 kubenswrapper[16352]: I0307 21:23:21.913174 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/installer-3-master-0"] Mar 07 21:23:23.208448 master-0 kubenswrapper[16352]: I0307 21:23:23.208327 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c34534f8-0d38-40a8-a28c-11c20ce64353" path="/var/lib/kubelet/pods/c34534f8-0d38-40a8-a28c-11c20ce64353/volumes" Mar 07 21:23:28.577379 master-0 kubenswrapper[16352]: I0307 21:23:28.577270 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:23:28.578439 master-0 kubenswrapper[16352]: I0307 21:23:28.577373 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:23:34.616563 master-0 kubenswrapper[16352]: I0307 21:23:34.616469 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 07 21:23:34.617627 master-0 kubenswrapper[16352]: I0307 21:23:34.617236 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" containerID="cri-o://409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112" gracePeriod=30 Mar 07 21:23:34.617627 master-0 kubenswrapper[16352]: I0307 21:23:34.617256 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" containerID="cri-o://71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801" gracePeriod=30 Mar 07 21:23:34.617627 master-0 kubenswrapper[16352]: I0307 21:23:34.617231 
16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" containerID="cri-o://eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8" gracePeriod=30 Mar 07 21:23:34.617627 master-0 kubenswrapper[16352]: I0307 21:23:34.617376 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" containerID="cri-o://3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24" gracePeriod=30 Mar 07 21:23:34.618008 master-0 kubenswrapper[16352]: I0307 21:23:34.617333 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-etcd/etcd-master-0" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" containerID="cri-o://db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1" gracePeriod=30 Mar 07 21:23:34.620268 master-0 kubenswrapper[16352]: I0307 21:23:34.620203 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-etcd/etcd-master-0"] Mar 07 21:23:34.620786 master-0 kubenswrapper[16352]: E0307 21:23:34.620744 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 07 21:23:34.620786 master-0 kubenswrapper[16352]: I0307 21:23:34.620776 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: E0307 21:23:34.620808 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: I0307 21:23:34.620822 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 07 
21:23:34.620968 master-0 kubenswrapper[16352]: E0307 21:23:34.620860 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: I0307 21:23:34.620873 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: E0307 21:23:34.620894 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c34534f8-0d38-40a8-a28c-11c20ce64353" containerName="installer" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: I0307 21:23:34.620906 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c34534f8-0d38-40a8-a28c-11c20ce64353" containerName="installer" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: E0307 21:23:34.620931 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: I0307 21:23:34.620945 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: E0307 21:23:34.620966 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 07 21:23:34.620968 master-0 kubenswrapper[16352]: I0307 21:23:34.620978 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: E0307 21:23:34.620999 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621012 16352 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: E0307 21:23:34.621040 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621053 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: E0307 21:23:34.621071 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621083 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621322 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-resources-copy" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621353 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-rev" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621378 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c34534f8-0d38-40a8-a28c-11c20ce64353" containerName="installer" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621399 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621421 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-metrics" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621446 16352 
memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcdctl" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621465 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-readyz" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621484 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="etcd-ensure-env-vars" Mar 07 21:23:34.621521 master-0 kubenswrapper[16352]: I0307 21:23:34.621505 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" containerName="setup" Mar 07 21:23:34.750851 master-0 kubenswrapper[16352]: I0307 21:23:34.750729 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.751166 master-0 kubenswrapper[16352]: I0307 21:23:34.750893 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.751166 master-0 kubenswrapper[16352]: I0307 21:23:34.751014 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.751166 master-0 kubenswrapper[16352]: I0307 21:23:34.751131 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.751166 master-0 kubenswrapper[16352]: I0307 21:23:34.751155 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.751600 master-0 kubenswrapper[16352]: I0307 21:23:34.751246 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854009 master-0 kubenswrapper[16352]: I0307 21:23:34.853892 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854009 master-0 kubenswrapper[16352]: I0307 21:23:34.854005 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854336 master-0 kubenswrapper[16352]: I0307 21:23:34.854043 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-resource-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854336 master-0 kubenswrapper[16352]: I0307 21:23:34.854125 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854336 master-0 kubenswrapper[16352]: I0307 21:23:34.854060 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-data-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854336 master-0 kubenswrapper[16352]: I0307 21:23:34.854241 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854336 master-0 kubenswrapper[16352]: I0307 21:23:34.854257 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-cert-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854548 master-0 kubenswrapper[16352]: I0307 21:23:34.854355 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854548 master-0 
kubenswrapper[16352]: I0307 21:23:34.854275 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-static-pod-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854548 master-0 kubenswrapper[16352]: I0307 21:23:34.854441 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-log-dir\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854672 master-0 kubenswrapper[16352]: I0307 21:23:34.854550 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.854770 master-0 kubenswrapper[16352]: I0307 21:23:34.854655 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/29c709c82970b529e7b9b895aa92ef05-usr-local-bin\") pod \"etcd-master-0\" (UID: \"29c709c82970b529e7b9b895aa92ef05\") " pod="openshift-etcd/etcd-master-0" Mar 07 21:23:34.994758 master-0 kubenswrapper[16352]: I0307 21:23:34.994528 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 07 21:23:34.996260 master-0 kubenswrapper[16352]: I0307 21:23:34.996200 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 07 21:23:34.999371 master-0 kubenswrapper[16352]: I0307 21:23:34.999316 16352 generic.go:334] "Generic (PLEG): container finished" 
podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8" exitCode=2 Mar 07 21:23:34.999506 master-0 kubenswrapper[16352]: I0307 21:23:34.999370 16352 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112" exitCode=0 Mar 07 21:23:34.999506 master-0 kubenswrapper[16352]: I0307 21:23:34.999388 16352 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801" exitCode=2 Mar 07 21:23:38.577510 master-0 kubenswrapper[16352]: I0307 21:23:38.577411 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:23:38.577510 master-0 kubenswrapper[16352]: I0307 21:23:38.577504 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:23:42.856670 master-0 kubenswrapper[16352]: I0307 21:23:42.856555 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:23:42.936380 master-0 kubenswrapper[16352]: I0307 21:23:42.936287 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:23:43.136106 master-0 kubenswrapper[16352]: I0307 21:23:43.135921 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:23:48.154851 master-0 
kubenswrapper[16352]: I0307 21:23:48.154770 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:23:48.155545 master-0 kubenswrapper[16352]: I0307 21:23:48.154867 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" exitCode=1 Mar 07 21:23:48.155545 master-0 kubenswrapper[16352]: I0307 21:23:48.154917 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerDied","Data":"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4"} Mar 07 21:23:48.155639 master-0 kubenswrapper[16352]: I0307 21:23:48.155562 16352 scope.go:117] "RemoveContainer" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" Mar 07 21:23:48.576576 master-0 kubenswrapper[16352]: I0307 21:23:48.576467 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:23:48.576941 master-0 kubenswrapper[16352]: I0307 21:23:48.576579 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:23:49.167808 master-0 kubenswrapper[16352]: I0307 21:23:49.167720 16352 generic.go:334] "Generic (PLEG): container finished" podID="23d4915d-4b88-4875-b794-414b5b7a1d7b" 
containerID="96332d6354d7124cab0f34581c5ba824a7340e0b1a13f33f2c5fec692c2a0562" exitCode=0 Mar 07 21:23:49.169038 master-0 kubenswrapper[16352]: I0307 21:23:49.168530 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"23d4915d-4b88-4875-b794-414b5b7a1d7b","Type":"ContainerDied","Data":"96332d6354d7124cab0f34581c5ba824a7340e0b1a13f33f2c5fec692c2a0562"} Mar 07 21:23:49.176514 master-0 kubenswrapper[16352]: I0307 21:23:49.176434 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:23:49.176661 master-0 kubenswrapper[16352]: I0307 21:23:49.176599 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a"} Mar 07 21:23:49.731747 master-0 kubenswrapper[16352]: E0307 21:23:49.731582 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:23:50.603106 master-0 kubenswrapper[16352]: I0307 21:23:50.603042 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 07 21:23:50.704514 master-0 kubenswrapper[16352]: I0307 21:23:50.704383 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-var-lock\") pod \"23d4915d-4b88-4875-b794-414b5b7a1d7b\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " Mar 07 21:23:50.704514 master-0 kubenswrapper[16352]: I0307 21:23:50.704480 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-var-lock" (OuterVolumeSpecName: "var-lock") pod "23d4915d-4b88-4875-b794-414b5b7a1d7b" (UID: "23d4915d-4b88-4875-b794-414b5b7a1d7b"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:23:50.704967 master-0 kubenswrapper[16352]: I0307 21:23:50.704757 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23d4915d-4b88-4875-b794-414b5b7a1d7b-kube-api-access\") pod \"23d4915d-4b88-4875-b794-414b5b7a1d7b\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " Mar 07 21:23:50.704967 master-0 kubenswrapper[16352]: I0307 21:23:50.704827 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-kubelet-dir\") pod \"23d4915d-4b88-4875-b794-414b5b7a1d7b\" (UID: \"23d4915d-4b88-4875-b794-414b5b7a1d7b\") " Mar 07 21:23:50.704967 master-0 kubenswrapper[16352]: I0307 21:23:50.704932 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "23d4915d-4b88-4875-b794-414b5b7a1d7b" (UID: "23d4915d-4b88-4875-b794-414b5b7a1d7b"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:23:50.705416 master-0 kubenswrapper[16352]: I0307 21:23:50.705376 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:50.705416 master-0 kubenswrapper[16352]: I0307 21:23:50.705403 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/23d4915d-4b88-4875-b794-414b5b7a1d7b-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:50.707692 master-0 kubenswrapper[16352]: I0307 21:23:50.707632 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23d4915d-4b88-4875-b794-414b5b7a1d7b-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "23d4915d-4b88-4875-b794-414b5b7a1d7b" (UID: "23d4915d-4b88-4875-b794-414b5b7a1d7b"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:23:50.797138 master-0 kubenswrapper[16352]: E0307 21:23:50.796724 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:23:40Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:23:40Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:23:40Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:23:40Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b
08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de
42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3cdb019b6769514c0e92ef92da73e914fbcf6254cc919677ee077c93ce324de0\\\"],\\\"sizeBytes\\\":605698200},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/oc
p-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b3452155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBy
tes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8f904c1084450856b501d40bbc9246265fe34a2b70efec23541e3285da7f88\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2d
b11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\\"],\\\"sizeBytes\\\":487151732}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:23:50.807288 master-0 kubenswrapper[16352]: I0307 21:23:50.807201 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/23d4915d-4b88-4875-b794-414b5b7a1d7b-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:23:51.204021 master-0 kubenswrapper[16352]: I0307 21:23:51.203949 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/installer-2-master-0" Mar 07 21:23:51.205153 master-0 kubenswrapper[16352]: I0307 21:23:51.205080 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/installer-2-master-0" event={"ID":"23d4915d-4b88-4875-b794-414b5b7a1d7b","Type":"ContainerDied","Data":"5566788b071694c42c6559fe212f6c8811fc2b86aa74412235ba1fd67b86ee63"} Mar 07 21:23:51.205228 master-0 kubenswrapper[16352]: I0307 21:23:51.205159 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5566788b071694c42c6559fe212f6c8811fc2b86aa74412235ba1fd67b86ee63" Mar 07 21:23:52.509611 master-0 kubenswrapper[16352]: I0307 21:23:52.508851 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:23:52.509611 master-0 kubenswrapper[16352]: I0307 21:23:52.508959 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:23:52.517711 master-0 
kubenswrapper[16352]: I0307 21:23:52.517601 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:23:58.577282 master-0 kubenswrapper[16352]: I0307 21:23:58.577055 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:23:58.579096 master-0 kubenswrapper[16352]: I0307 21:23:58.577162 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:23:59.733378 master-0 kubenswrapper[16352]: E0307 21:23:59.733016 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:00.798065 master-0 kubenswrapper[16352]: E0307 21:24:00.797936 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:02.516428 master-0 kubenswrapper[16352]: I0307 21:24:02.516323 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:24:02.620646 master-0 kubenswrapper[16352]: I0307 21:24:02.620530 16352 scope.go:117] "RemoveContainer" 
containerID="d24d032319a9f87acbbf34deb36cb14122c07e93e1e3dd0d42d28beaf572ecc6" Mar 07 21:24:04.872114 master-0 kubenswrapper[16352]: I0307 21:24:04.872027 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 07 21:24:04.873897 master-0 kubenswrapper[16352]: I0307 21:24:04.873838 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 07 21:24:04.874752 master-0 kubenswrapper[16352]: I0307 21:24:04.874713 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 07 21:24:04.875322 master-0 kubenswrapper[16352]: I0307 21:24:04.875280 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 07 21:24:04.877337 master-0 kubenswrapper[16352]: I0307 21:24:04.877299 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 07 21:24:04.941913 master-0 kubenswrapper[16352]: I0307 21:24:04.941790 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 07 21:24:04.942257 master-0 kubenswrapper[16352]: I0307 21:24:04.941935 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:04.942545 master-0 kubenswrapper[16352]: I0307 21:24:04.942510 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 07 21:24:04.942707 master-0 kubenswrapper[16352]: I0307 21:24:04.942638 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir" (OuterVolumeSpecName: "data-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "data-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:04.942895 master-0 kubenswrapper[16352]: I0307 21:24:04.942867 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 07 21:24:04.943099 master-0 kubenswrapper[16352]: I0307 21:24:04.943073 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 07 21:24:04.943269 master-0 kubenswrapper[16352]: I0307 21:24:04.943244 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 07 21:24:04.943456 master-0 kubenswrapper[16352]: I0307 21:24:04.943431 16352 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") pod \"8e52bef89f4b50e4590a1719bcc5d7e5\" (UID: \"8e52bef89f4b50e4590a1719bcc5d7e5\") " Mar 07 21:24:04.943674 master-0 kubenswrapper[16352]: I0307 21:24:04.942911 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:04.943674 master-0 kubenswrapper[16352]: I0307 21:24:04.943138 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir" (OuterVolumeSpecName: "static-pod-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "static-pod-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:04.943866 master-0 kubenswrapper[16352]: I0307 21:24:04.943316 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir" (OuterVolumeSpecName: "log-dir") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "log-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:04.943866 master-0 kubenswrapper[16352]: I0307 21:24:04.943486 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin" (OuterVolumeSpecName: "usr-local-bin") pod "8e52bef89f4b50e4590a1719bcc5d7e5" (UID: "8e52bef89f4b50e4590a1719bcc5d7e5"). InnerVolumeSpecName "usr-local-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:04.944513 master-0 kubenswrapper[16352]: I0307 21:24:04.944483 16352 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:04.944651 master-0 kubenswrapper[16352]: I0307 21:24:04.944630 16352 reconciler_common.go:293] "Volume detached for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-data-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:04.944835 master-0 kubenswrapper[16352]: I0307 21:24:04.944809 16352 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:04.944972 master-0 kubenswrapper[16352]: I0307 21:24:04.944947 16352 reconciler_common.go:293] "Volume detached for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-static-pod-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:04.945170 master-0 kubenswrapper[16352]: I0307 21:24:04.945148 16352 reconciler_common.go:293] "Volume detached for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-log-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:04.945316 master-0 kubenswrapper[16352]: I0307 21:24:04.945295 16352 reconciler_common.go:293] "Volume detached for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/8e52bef89f4b50e4590a1719bcc5d7e5-usr-local-bin\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:05.204496 master-0 kubenswrapper[16352]: I0307 21:24:05.204254 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8e52bef89f4b50e4590a1719bcc5d7e5" path="/var/lib/kubelet/pods/8e52bef89f4b50e4590a1719bcc5d7e5/volumes" Mar 07 21:24:05.357306 master-0 
kubenswrapper[16352]: I0307 21:24:05.357223 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-rev/0.log" Mar 07 21:24:05.360224 master-0 kubenswrapper[16352]: I0307 21:24:05.360196 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd-metrics/0.log" Mar 07 21:24:05.360882 master-0 kubenswrapper[16352]: I0307 21:24:05.360866 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcd/0.log" Mar 07 21:24:05.361420 master-0 kubenswrapper[16352]: I0307 21:24:05.361403 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-etcd_etcd-master-0_8e52bef89f4b50e4590a1719bcc5d7e5/etcdctl/0.log" Mar 07 21:24:05.362610 master-0 kubenswrapper[16352]: I0307 21:24:05.362587 16352 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24" exitCode=137 Mar 07 21:24:05.362772 master-0 kubenswrapper[16352]: I0307 21:24:05.362671 16352 generic.go:334] "Generic (PLEG): container finished" podID="8e52bef89f4b50e4590a1719bcc5d7e5" containerID="db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1" exitCode=137 Mar 07 21:24:05.362885 master-0 kubenswrapper[16352]: I0307 21:24:05.362749 16352 scope.go:117] "RemoveContainer" containerID="eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8" Mar 07 21:24:05.363023 master-0 kubenswrapper[16352]: I0307 21:24:05.362870 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 07 21:24:05.389901 master-0 kubenswrapper[16352]: I0307 21:24:05.389846 16352 scope.go:117] "RemoveContainer" containerID="409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112" Mar 07 21:24:05.415569 master-0 kubenswrapper[16352]: I0307 21:24:05.415491 16352 scope.go:117] "RemoveContainer" containerID="71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801" Mar 07 21:24:05.440577 master-0 kubenswrapper[16352]: I0307 21:24:05.440487 16352 scope.go:117] "RemoveContainer" containerID="3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24" Mar 07 21:24:05.470897 master-0 kubenswrapper[16352]: I0307 21:24:05.470820 16352 scope.go:117] "RemoveContainer" containerID="db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1" Mar 07 21:24:05.494751 master-0 kubenswrapper[16352]: I0307 21:24:05.494500 16352 scope.go:117] "RemoveContainer" containerID="62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31" Mar 07 21:24:05.519145 master-0 kubenswrapper[16352]: I0307 21:24:05.519057 16352 scope.go:117] "RemoveContainer" containerID="49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2" Mar 07 21:24:05.543661 master-0 kubenswrapper[16352]: I0307 21:24:05.543587 16352 scope.go:117] "RemoveContainer" containerID="aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317" Mar 07 21:24:05.571164 master-0 kubenswrapper[16352]: I0307 21:24:05.571082 16352 scope.go:117] "RemoveContainer" containerID="eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8" Mar 07 21:24:05.571739 master-0 kubenswrapper[16352]: E0307 21:24:05.571669 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8\": container with ID starting with eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8 not found: ID does not 
exist" containerID="eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8" Mar 07 21:24:05.571855 master-0 kubenswrapper[16352]: I0307 21:24:05.571756 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8"} err="failed to get container status \"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8\": rpc error: code = NotFound desc = could not find container \"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8\": container with ID starting with eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8 not found: ID does not exist" Mar 07 21:24:05.571855 master-0 kubenswrapper[16352]: I0307 21:24:05.571801 16352 scope.go:117] "RemoveContainer" containerID="409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112" Mar 07 21:24:05.572452 master-0 kubenswrapper[16352]: E0307 21:24:05.572415 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112\": container with ID starting with 409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112 not found: ID does not exist" containerID="409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112" Mar 07 21:24:05.572555 master-0 kubenswrapper[16352]: I0307 21:24:05.572457 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112"} err="failed to get container status \"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112\": rpc error: code = NotFound desc = could not find container \"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112\": container with ID starting with 409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112 not found: ID does not exist" Mar 07 21:24:05.572555 
master-0 kubenswrapper[16352]: I0307 21:24:05.572486 16352 scope.go:117] "RemoveContainer" containerID="71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801" Mar 07 21:24:05.572983 master-0 kubenswrapper[16352]: E0307 21:24:05.572952 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801\": container with ID starting with 71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801 not found: ID does not exist" containerID="71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801" Mar 07 21:24:05.572983 master-0 kubenswrapper[16352]: I0307 21:24:05.572972 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801"} err="failed to get container status \"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801\": rpc error: code = NotFound desc = could not find container \"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801\": container with ID starting with 71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801 not found: ID does not exist" Mar 07 21:24:05.572983 master-0 kubenswrapper[16352]: I0307 21:24:05.572985 16352 scope.go:117] "RemoveContainer" containerID="3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24" Mar 07 21:24:05.574094 master-0 kubenswrapper[16352]: E0307 21:24:05.573816 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24\": container with ID starting with 3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24 not found: ID does not exist" containerID="3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24" Mar 07 21:24:05.574094 master-0 kubenswrapper[16352]: I0307 
21:24:05.573932 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24"} err="failed to get container status \"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24\": rpc error: code = NotFound desc = could not find container \"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24\": container with ID starting with 3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24 not found: ID does not exist" Mar 07 21:24:05.574094 master-0 kubenswrapper[16352]: I0307 21:24:05.573981 16352 scope.go:117] "RemoveContainer" containerID="db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1" Mar 07 21:24:05.574648 master-0 kubenswrapper[16352]: E0307 21:24:05.574581 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1\": container with ID starting with db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1 not found: ID does not exist" containerID="db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1" Mar 07 21:24:05.574826 master-0 kubenswrapper[16352]: I0307 21:24:05.574645 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1"} err="failed to get container status \"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1\": rpc error: code = NotFound desc = could not find container \"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1\": container with ID starting with db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1 not found: ID does not exist" Mar 07 21:24:05.574826 master-0 kubenswrapper[16352]: I0307 21:24:05.574672 16352 scope.go:117] "RemoveContainer" 
containerID="62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31" Mar 07 21:24:05.575189 master-0 kubenswrapper[16352]: E0307 21:24:05.575141 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31\": container with ID starting with 62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31 not found: ID does not exist" containerID="62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31" Mar 07 21:24:05.575189 master-0 kubenswrapper[16352]: I0307 21:24:05.575178 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31"} err="failed to get container status \"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31\": rpc error: code = NotFound desc = could not find container \"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31\": container with ID starting with 62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31 not found: ID does not exist" Mar 07 21:24:05.575428 master-0 kubenswrapper[16352]: I0307 21:24:05.575204 16352 scope.go:117] "RemoveContainer" containerID="49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2" Mar 07 21:24:05.575752 master-0 kubenswrapper[16352]: E0307 21:24:05.575652 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2\": container with ID starting with 49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2 not found: ID does not exist" containerID="49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2" Mar 07 21:24:05.575881 master-0 kubenswrapper[16352]: I0307 21:24:05.575766 16352 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2"} err="failed to get container status \"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2\": rpc error: code = NotFound desc = could not find container \"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2\": container with ID starting with 49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2 not found: ID does not exist" Mar 07 21:24:05.575881 master-0 kubenswrapper[16352]: I0307 21:24:05.575785 16352 scope.go:117] "RemoveContainer" containerID="aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317" Mar 07 21:24:05.576300 master-0 kubenswrapper[16352]: E0307 21:24:05.576239 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317\": container with ID starting with aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317 not found: ID does not exist" containerID="aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317" Mar 07 21:24:05.576505 master-0 kubenswrapper[16352]: I0307 21:24:05.576458 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317"} err="failed to get container status \"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317\": rpc error: code = NotFound desc = could not find container \"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317\": container with ID starting with aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317 not found: ID does not exist" Mar 07 21:24:05.576661 master-0 kubenswrapper[16352]: I0307 21:24:05.576638 16352 scope.go:117] "RemoveContainer" containerID="eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8" Mar 07 21:24:05.577369 master-0 kubenswrapper[16352]: I0307 
21:24:05.577328 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8"} err="failed to get container status \"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8\": rpc error: code = NotFound desc = could not find container \"eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8\": container with ID starting with eee34319c0673dbff2be4903fd69ce1d7b9e6f5b848a253f9f65757105c734e8 not found: ID does not exist" Mar 07 21:24:05.577369 master-0 kubenswrapper[16352]: I0307 21:24:05.577353 16352 scope.go:117] "RemoveContainer" containerID="409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112" Mar 07 21:24:05.578224 master-0 kubenswrapper[16352]: I0307 21:24:05.578114 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112"} err="failed to get container status \"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112\": rpc error: code = NotFound desc = could not find container \"409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112\": container with ID starting with 409673ecf9a7020fd62f9f152a2b7b3f088b50d7fcb4b3f4a863ae414e8c4112 not found: ID does not exist" Mar 07 21:24:05.578224 master-0 kubenswrapper[16352]: I0307 21:24:05.578215 16352 scope.go:117] "RemoveContainer" containerID="71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801" Mar 07 21:24:05.578822 master-0 kubenswrapper[16352]: I0307 21:24:05.578770 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801"} err="failed to get container status \"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801\": rpc error: code = NotFound desc = could not find container 
\"71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801\": container with ID starting with 71d70d0786391280525248d1d8f56bb970473b89a94183f09b5946f4f1804801 not found: ID does not exist" Mar 07 21:24:05.578822 master-0 kubenswrapper[16352]: I0307 21:24:05.578805 16352 scope.go:117] "RemoveContainer" containerID="3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24" Mar 07 21:24:05.579254 master-0 kubenswrapper[16352]: I0307 21:24:05.579207 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24"} err="failed to get container status \"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24\": rpc error: code = NotFound desc = could not find container \"3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24\": container with ID starting with 3c0c0903536cdcdabfb00afad7761f474f969470e727a20cce3a1f1ccd8cbc24 not found: ID does not exist" Mar 07 21:24:05.579254 master-0 kubenswrapper[16352]: I0307 21:24:05.579230 16352 scope.go:117] "RemoveContainer" containerID="db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1" Mar 07 21:24:05.579738 master-0 kubenswrapper[16352]: I0307 21:24:05.579648 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1"} err="failed to get container status \"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1\": rpc error: code = NotFound desc = could not find container \"db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1\": container with ID starting with db6417130ed911468a6e3836e6d425c8a7bbc7a94e837b5fd966ca77525f1ba1 not found: ID does not exist" Mar 07 21:24:05.579738 master-0 kubenswrapper[16352]: I0307 21:24:05.579701 16352 scope.go:117] "RemoveContainer" containerID="62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31" Mar 07 
21:24:05.580220 master-0 kubenswrapper[16352]: I0307 21:24:05.580159 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31"} err="failed to get container status \"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31\": rpc error: code = NotFound desc = could not find container \"62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31\": container with ID starting with 62c687cddba4141c6b390dfda9de284968f6829ab4eba64f14aea554874fab31 not found: ID does not exist" Mar 07 21:24:05.580220 master-0 kubenswrapper[16352]: I0307 21:24:05.580193 16352 scope.go:117] "RemoveContainer" containerID="49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2" Mar 07 21:24:05.580610 master-0 kubenswrapper[16352]: I0307 21:24:05.580553 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2"} err="failed to get container status \"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2\": rpc error: code = NotFound desc = could not find container \"49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2\": container with ID starting with 49723a49476e0033af47315981ffa9b7216c970b2ceea667da31c9ffb291c0f2 not found: ID does not exist" Mar 07 21:24:05.580610 master-0 kubenswrapper[16352]: I0307 21:24:05.580581 16352 scope.go:117] "RemoveContainer" containerID="aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317" Mar 07 21:24:05.581326 master-0 kubenswrapper[16352]: I0307 21:24:05.581266 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317"} err="failed to get container status \"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317\": rpc error: code = NotFound desc = could not find 
container \"aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317\": container with ID starting with aba308dc7f4af540fecbf44a6bbd0fc390e51e704f8c4e8b4d49a71f73a75317 not found: ID does not exist" Mar 07 21:24:08.400253 master-0 kubenswrapper[16352]: I0307 21:24:08.400147 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d/installer/0.log" Mar 07 21:24:08.400253 master-0 kubenswrapper[16352]: I0307 21:24:08.400243 16352 generic.go:334] "Generic (PLEG): container finished" podID="4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" containerID="050e30128b0cdbce0f0b53b881d9261c6be2ceecd72c21722431553b7c58b6a4" exitCode=1 Mar 07 21:24:08.401452 master-0 kubenswrapper[16352]: I0307 21:24:08.400294 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d","Type":"ContainerDied","Data":"050e30128b0cdbce0f0b53b881d9261c6be2ceecd72c21722431553b7c58b6a4"} Mar 07 21:24:08.584729 master-0 kubenswrapper[16352]: I0307 21:24:08.577177 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:24:08.584729 master-0 kubenswrapper[16352]: I0307 21:24:08.577258 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:24:08.666608 master-0 kubenswrapper[16352]: E0307 21:24:08.666280 16352 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" 
event="&Event{ObjectMeta:{etcd-master-0.189aac162fef731e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Killing,Message:Stopping container etcd-readyz,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:23:34.617191198 +0000 UTC m=+337.687896257,LastTimestamp:2026-03-07 21:23:34.617191198 +0000 UTC m=+337.687896257,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:24:09.189567 master-0 kubenswrapper[16352]: I0307 21:24:09.189426 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 07 21:24:09.223941 master-0 kubenswrapper[16352]: I0307 21:24:09.223857 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:24:09.223941 master-0 kubenswrapper[16352]: I0307 21:24:09.223917 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:24:09.733810 master-0 kubenswrapper[16352]: E0307 21:24:09.733696 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:09.912074 master-0 kubenswrapper[16352]: I0307 21:24:09.911996 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d/installer/0.log" Mar 07 21:24:09.912276 master-0 kubenswrapper[16352]: I0307 21:24:09.912110 16352 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 07 21:24:10.048965 master-0 kubenswrapper[16352]: I0307 21:24:10.048839 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-var-lock\") pod \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " Mar 07 21:24:10.049302 master-0 kubenswrapper[16352]: I0307 21:24:10.049076 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-var-lock" (OuterVolumeSpecName: "var-lock") pod "4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" (UID: "4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:10.049302 master-0 kubenswrapper[16352]: I0307 21:24:10.049099 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kube-api-access\") pod \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " Mar 07 21:24:10.049426 master-0 kubenswrapper[16352]: I0307 21:24:10.049315 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kubelet-dir\") pod \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\" (UID: \"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d\") " Mar 07 21:24:10.049552 master-0 kubenswrapper[16352]: I0307 21:24:10.049463 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" (UID: "4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d"). 
InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:10.050329 master-0 kubenswrapper[16352]: I0307 21:24:10.050267 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:10.050329 master-0 kubenswrapper[16352]: I0307 21:24:10.050323 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:10.054806 master-0 kubenswrapper[16352]: I0307 21:24:10.054611 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" (UID: "4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:24:10.152630 master-0 kubenswrapper[16352]: I0307 21:24:10.152492 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:10.424488 master-0 kubenswrapper[16352]: I0307 21:24:10.424381 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_installer-4-master-0_4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d/installer/0.log" Mar 07 21:24:10.424488 master-0 kubenswrapper[16352]: I0307 21:24:10.424491 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-master-0" event={"ID":"4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d","Type":"ContainerDied","Data":"54cddc214c657eb2c7e08e1f1a8b46f428acef319964a3a20155b957d275dec9"} Mar 07 21:24:10.425106 master-0 kubenswrapper[16352]: I0307 21:24:10.424536 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="54cddc214c657eb2c7e08e1f1a8b46f428acef319964a3a20155b957d275dec9" Mar 07 21:24:10.425106 master-0 kubenswrapper[16352]: I0307 21:24:10.424600 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-master-0" Mar 07 21:24:10.799254 master-0 kubenswrapper[16352]: E0307 21:24:10.799001 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:18.577028 master-0 kubenswrapper[16352]: I0307 21:24:18.576918 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:24:18.578188 master-0 kubenswrapper[16352]: I0307 21:24:18.577058 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:24:19.735149 master-0 kubenswrapper[16352]: E0307 21:24:19.735014 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:20.800471 master-0 kubenswrapper[16352]: E0307 21:24:20.800353 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:28.576656 master-0 kubenswrapper[16352]: I0307 21:24:28.576557 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console 
namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:24:28.577600 master-0 kubenswrapper[16352]: I0307 21:24:28.576658 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:24:29.736021 master-0 kubenswrapper[16352]: E0307 21:24:29.735910 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:29.736021 master-0 kubenswrapper[16352]: I0307 21:24:29.736003 16352 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 07 21:24:30.801494 master-0 kubenswrapper[16352]: E0307 21:24:30.801360 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:30.801494 master-0 kubenswrapper[16352]: E0307 21:24:30.801450 16352 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 21:24:35.695261 master-0 kubenswrapper[16352]: I0307 21:24:35.695164 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kpsm4_27b149f7-6aff-45f3-b935-e65279f2f9ee/approver/1.log" Mar 07 21:24:35.696453 master-0 kubenswrapper[16352]: I0307 21:24:35.695999 16352 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kpsm4_27b149f7-6aff-45f3-b935-e65279f2f9ee/approver/0.log" Mar 07 21:24:35.696546 master-0 kubenswrapper[16352]: I0307 21:24:35.696460 16352 generic.go:334] "Generic (PLEG): container finished" podID="27b149f7-6aff-45f3-b935-e65279f2f9ee" containerID="6c5c7fd45d6f80f9f78c1d57d4b829fe4f9dc0f4710c478f224a6b64ce861f57" exitCode=1 Mar 07 21:24:35.696650 master-0 kubenswrapper[16352]: I0307 21:24:35.696556 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kpsm4" event={"ID":"27b149f7-6aff-45f3-b935-e65279f2f9ee","Type":"ContainerDied","Data":"6c5c7fd45d6f80f9f78c1d57d4b829fe4f9dc0f4710c478f224a6b64ce861f57"} Mar 07 21:24:35.696978 master-0 kubenswrapper[16352]: I0307 21:24:35.696898 16352 scope.go:117] "RemoveContainer" containerID="98d5387debce255a652d1b794239fb6ace25d54dad34766bdbf701b015ffe247" Mar 07 21:24:35.697587 master-0 kubenswrapper[16352]: I0307 21:24:35.697544 16352 scope.go:117] "RemoveContainer" containerID="6c5c7fd45d6f80f9f78c1d57d4b829fe4f9dc0f4710c478f224a6b64ce861f57" Mar 07 21:24:36.710814 master-0 kubenswrapper[16352]: I0307 21:24:36.710663 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-node-identity_network-node-identity-kpsm4_27b149f7-6aff-45f3-b935-e65279f2f9ee/approver/1.log" Mar 07 21:24:36.711872 master-0 kubenswrapper[16352]: I0307 21:24:36.711311 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-kpsm4" event={"ID":"27b149f7-6aff-45f3-b935-e65279f2f9ee","Type":"ContainerStarted","Data":"e758417fc9531cd0019fd8eb696c6c1f70f0a0b1478d779971695d29c2bdf716"} Mar 07 21:24:38.577658 master-0 kubenswrapper[16352]: I0307 21:24:38.577555 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:24:38.578559 master-0 kubenswrapper[16352]: I0307 21:24:38.577672 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:24:39.736374 master-0 kubenswrapper[16352]: E0307 21:24:39.736261 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": context deadline exceeded" interval="200ms" Mar 07 21:24:42.670656 master-0 kubenswrapper[16352]: E0307 21:24:42.670399 16352 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189aac162fef9768 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Killing,Message:Stopping container etcd-rev,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:23:34.617200488 +0000 UTC m=+337.687905587,LastTimestamp:2026-03-07 21:23:34.617200488 +0000 UTC m=+337.687905587,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:24:42.939002 master-0 kubenswrapper[16352]: I0307 21:24:42.938801 16352 status_manager.go:851] "Failed to get status for pod" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" pod="openshift-monitoring/prometheus-k8s-0" err="the server was 
unable to return a response in the time allotted, but may still be processing the request (get pods prometheus-k8s-0)" Mar 07 21:24:43.228081 master-0 kubenswrapper[16352]: E0307 21:24:43.227788 16352 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 07 21:24:43.228800 master-0 kubenswrapper[16352]: I0307 21:24:43.228739 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-master-0" Mar 07 21:24:43.265850 master-0 kubenswrapper[16352]: W0307 21:24:43.265373 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod29c709c82970b529e7b9b895aa92ef05.slice/crio-6a2b6b06d62d67977a6f0de7b260ab7d95101b9a8e5688426656804b73846cb4 WatchSource:0}: Error finding container 6a2b6b06d62d67977a6f0de7b260ab7d95101b9a8e5688426656804b73846cb4: Status 404 returned error can't find the container with id 6a2b6b06d62d67977a6f0de7b260ab7d95101b9a8e5688426656804b73846cb4 Mar 07 21:24:43.796063 master-0 kubenswrapper[16352]: I0307 21:24:43.795860 16352 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="2026c6c71c72f452737624f1855128e00509d41f7bc843c6793c53fd926fd9a7" exitCode=0 Mar 07 21:24:43.796063 master-0 kubenswrapper[16352]: I0307 21:24:43.795950 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"2026c6c71c72f452737624f1855128e00509d41f7bc843c6793c53fd926fd9a7"} Mar 07 21:24:43.796063 master-0 kubenswrapper[16352]: I0307 21:24:43.796007 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" 
event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"6a2b6b06d62d67977a6f0de7b260ab7d95101b9a8e5688426656804b73846cb4"} Mar 07 21:24:43.797175 master-0 kubenswrapper[16352]: I0307 21:24:43.796667 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:24:43.797175 master-0 kubenswrapper[16352]: I0307 21:24:43.796769 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:24:45.820609 master-0 kubenswrapper[16352]: I0307 21:24:45.820502 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_96e31400-86e3-46d2-97ee-12fd3e17893a/installer/0.log" Mar 07 21:24:45.821755 master-0 kubenswrapper[16352]: I0307 21:24:45.820615 16352 generic.go:334] "Generic (PLEG): container finished" podID="96e31400-86e3-46d2-97ee-12fd3e17893a" containerID="d85efd8cc28e3eb5fccfdb120c70d3e50513bd9dfd8370494bcc501d11ca0703" exitCode=1 Mar 07 21:24:45.821755 master-0 kubenswrapper[16352]: I0307 21:24:45.820783 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"96e31400-86e3-46d2-97ee-12fd3e17893a","Type":"ContainerDied","Data":"d85efd8cc28e3eb5fccfdb120c70d3e50513bd9dfd8370494bcc501d11ca0703"} Mar 07 21:24:45.824013 master-0 kubenswrapper[16352]: I0307 21:24:45.823930 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_72b4d517-f9c1-4fb2-9217-bd02b6838b07/installer/0.log" Mar 07 21:24:45.824139 master-0 kubenswrapper[16352]: I0307 21:24:45.824071 16352 generic.go:334] "Generic (PLEG): container finished" podID="72b4d517-f9c1-4fb2-9217-bd02b6838b07" containerID="22abd09e7dbeec5e7d0ea14165d6b45aaab0cf611196679414518d4f52358f7f" exitCode=1 Mar 07 21:24:45.824212 master-0 kubenswrapper[16352]: I0307 21:24:45.824147 
16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"72b4d517-f9c1-4fb2-9217-bd02b6838b07","Type":"ContainerDied","Data":"22abd09e7dbeec5e7d0ea14165d6b45aaab0cf611196679414518d4f52358f7f"} Mar 07 21:24:47.244795 master-0 kubenswrapper[16352]: I0307 21:24:47.242400 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_72b4d517-f9c1-4fb2-9217-bd02b6838b07/installer/0.log" Mar 07 21:24:47.244795 master-0 kubenswrapper[16352]: I0307 21:24:47.242523 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 07 21:24:47.397811 master-0 kubenswrapper[16352]: I0307 21:24:47.397617 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_96e31400-86e3-46d2-97ee-12fd3e17893a/installer/0.log" Mar 07 21:24:47.397811 master-0 kubenswrapper[16352]: I0307 21:24:47.397781 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:24:47.438940 master-0 kubenswrapper[16352]: I0307 21:24:47.437849 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kube-api-access\") pod \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " Mar 07 21:24:47.438940 master-0 kubenswrapper[16352]: I0307 21:24:47.438096 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kubelet-dir\") pod \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " Mar 07 21:24:47.438940 master-0 kubenswrapper[16352]: I0307 21:24:47.438211 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "72b4d517-f9c1-4fb2-9217-bd02b6838b07" (UID: "72b4d517-f9c1-4fb2-9217-bd02b6838b07"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:47.438940 master-0 kubenswrapper[16352]: I0307 21:24:47.438409 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-var-lock\") pod \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\" (UID: \"72b4d517-f9c1-4fb2-9217-bd02b6838b07\") " Mar 07 21:24:47.438940 master-0 kubenswrapper[16352]: I0307 21:24:47.438739 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-var-lock" (OuterVolumeSpecName: "var-lock") pod "72b4d517-f9c1-4fb2-9217-bd02b6838b07" (UID: "72b4d517-f9c1-4fb2-9217-bd02b6838b07"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:47.440596 master-0 kubenswrapper[16352]: I0307 21:24:47.440409 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:47.440596 master-0 kubenswrapper[16352]: I0307 21:24:47.440458 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/72b4d517-f9c1-4fb2-9217-bd02b6838b07-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:47.443562 master-0 kubenswrapper[16352]: I0307 21:24:47.443467 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "72b4d517-f9c1-4fb2-9217-bd02b6838b07" (UID: "72b4d517-f9c1-4fb2-9217-bd02b6838b07"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:24:47.542048 master-0 kubenswrapper[16352]: I0307 21:24:47.541334 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-var-lock\") pod \"96e31400-86e3-46d2-97ee-12fd3e17893a\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " Mar 07 21:24:47.542048 master-0 kubenswrapper[16352]: I0307 21:24:47.541449 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-kubelet-dir\") pod \"96e31400-86e3-46d2-97ee-12fd3e17893a\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " Mar 07 21:24:47.542048 master-0 kubenswrapper[16352]: I0307 21:24:47.541534 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-var-lock" (OuterVolumeSpecName: "var-lock") pod "96e31400-86e3-46d2-97ee-12fd3e17893a" (UID: "96e31400-86e3-46d2-97ee-12fd3e17893a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:47.542048 master-0 kubenswrapper[16352]: I0307 21:24:47.541581 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96e31400-86e3-46d2-97ee-12fd3e17893a-kube-api-access\") pod \"96e31400-86e3-46d2-97ee-12fd3e17893a\" (UID: \"96e31400-86e3-46d2-97ee-12fd3e17893a\") " Mar 07 21:24:47.542048 master-0 kubenswrapper[16352]: I0307 21:24:47.541787 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "96e31400-86e3-46d2-97ee-12fd3e17893a" (UID: "96e31400-86e3-46d2-97ee-12fd3e17893a"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:24:47.542909 master-0 kubenswrapper[16352]: I0307 21:24:47.542805 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/72b4d517-f9c1-4fb2-9217-bd02b6838b07-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:47.542909 master-0 kubenswrapper[16352]: I0307 21:24:47.542846 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:47.542909 master-0 kubenswrapper[16352]: I0307 21:24:47.542867 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/96e31400-86e3-46d2-97ee-12fd3e17893a-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:47.546921 master-0 kubenswrapper[16352]: I0307 21:24:47.546862 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96e31400-86e3-46d2-97ee-12fd3e17893a-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "96e31400-86e3-46d2-97ee-12fd3e17893a" (UID: "96e31400-86e3-46d2-97ee-12fd3e17893a"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:24:47.645387 master-0 kubenswrapper[16352]: I0307 21:24:47.645269 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/96e31400-86e3-46d2-97ee-12fd3e17893a-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:24:47.845213 master-0 kubenswrapper[16352]: I0307 21:24:47.845024 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_installer-5-master-0_96e31400-86e3-46d2-97ee-12fd3e17893a/installer/0.log" Mar 07 21:24:47.845628 master-0 kubenswrapper[16352]: I0307 21:24:47.845182 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-master-0" event={"ID":"96e31400-86e3-46d2-97ee-12fd3e17893a","Type":"ContainerDied","Data":"2045efb936aa65816e8bbac215c1cd80d641c4f9280cb254d3b2b050096e0d95"} Mar 07 21:24:47.845628 master-0 kubenswrapper[16352]: I0307 21:24:47.845238 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-master-0" Mar 07 21:24:47.845628 master-0 kubenswrapper[16352]: I0307 21:24:47.845269 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2045efb936aa65816e8bbac215c1cd80d641c4f9280cb254d3b2b050096e0d95" Mar 07 21:24:47.848722 master-0 kubenswrapper[16352]: I0307 21:24:47.848621 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_installer-4-master-0_72b4d517-f9c1-4fb2-9217-bd02b6838b07/installer/0.log" Mar 07 21:24:47.848868 master-0 kubenswrapper[16352]: I0307 21:24:47.848745 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-4-master-0" event={"ID":"72b4d517-f9c1-4fb2-9217-bd02b6838b07","Type":"ContainerDied","Data":"f45abd2e0704cacec7b591fdf1a81fae9a35aa9a429a48956b2209d5b72e79df"} Mar 07 21:24:47.848868 master-0 kubenswrapper[16352]: I0307 21:24:47.848784 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f45abd2e0704cacec7b591fdf1a81fae9a35aa9a429a48956b2209d5b72e79df" Mar 07 21:24:47.848868 master-0 kubenswrapper[16352]: I0307 21:24:47.848825 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-4-master-0" Mar 07 21:24:48.577079 master-0 kubenswrapper[16352]: I0307 21:24:48.576900 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:24:48.577079 master-0 kubenswrapper[16352]: I0307 21:24:48.576977 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:24:49.938841 master-0 kubenswrapper[16352]: E0307 21:24:49.938461 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="400ms" Mar 07 21:24:50.954227 master-0 kubenswrapper[16352]: E0307 21:24:50.954040 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:24:40Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:24:40Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:24:40Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:24:40Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3cdb019b6769514c0e92ef92da73e914fbcf6254cc919677ee077c93ce324de0\\\"],\\\"sizeBytes\\\":605698200},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b345
2155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8f904c1084450856b501d40bbc9246265fe34a2b70efec23541e3285da7f88\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\
\"],\\\"sizeBytes\\\":487151732}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:24:58.577492 master-0 kubenswrapper[16352]: I0307 21:24:58.577415 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:24:58.578394 master-0 kubenswrapper[16352]: I0307 21:24:58.577504 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:25:00.340151 master-0 kubenswrapper[16352]: E0307 21:25:00.340037 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="800ms" Mar 07 21:25:00.954802 master-0 kubenswrapper[16352]: E0307 21:25:00.954640 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:25:08.577325 master-0 kubenswrapper[16352]: I0307 21:25:08.577140 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" 
start-of-body= Mar 07 21:25:08.579143 master-0 kubenswrapper[16352]: I0307 21:25:08.577340 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:25:10.955405 master-0 kubenswrapper[16352]: E0307 21:25:10.955211 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:25:11.142909 master-0 kubenswrapper[16352]: E0307 21:25:11.142777 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="1.6s" Mar 07 21:25:16.675347 master-0 kubenswrapper[16352]: E0307 21:25:16.675043 16352 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189aac162fefdcbe openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Killing,Message:Stopping container etcd-metrics,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:23:34.617218238 +0000 UTC m=+337.687923337,LastTimestamp:2026-03-07 21:23:34.617218238 +0000 UTC m=+337.687923337,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:25:17.801014 master-0 kubenswrapper[16352]: E0307 21:25:17.800883 16352 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 07 21:25:18.577439 master-0 kubenswrapper[16352]: I0307 21:25:18.577335 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:25:18.577439 master-0 kubenswrapper[16352]: I0307 21:25:18.577434 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:25:19.151047 master-0 kubenswrapper[16352]: I0307 21:25:19.150976 16352 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="11503158d4d3b5d452555eef540497642233db15253111a5df7cafd476717f7b" exitCode=0 Mar 07 21:25:19.152149 master-0 kubenswrapper[16352]: I0307 21:25:19.151037 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"11503158d4d3b5d452555eef540497642233db15253111a5df7cafd476717f7b"} Mar 07 21:25:19.152995 master-0 kubenswrapper[16352]: I0307 21:25:19.152921 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:25:19.152995 master-0 kubenswrapper[16352]: I0307 21:25:19.152986 16352 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:25:20.956202 master-0 kubenswrapper[16352]: E0307 21:25:20.955628 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 07 21:25:22.766023 master-0 kubenswrapper[16352]: E0307 21:25:22.765855 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="3.2s" Mar 07 21:25:25.222298 master-0 kubenswrapper[16352]: I0307 21:25:25.222159 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" event={"ID":"fc392945-53ad-473c-8803-70e2026712d2","Type":"ContainerDied","Data":"d4b7300644150fe23cfc59508105971a56a432a4d87f592adbcc874823ecb22d"} Mar 07 21:25:25.223243 master-0 kubenswrapper[16352]: I0307 21:25:25.222001 16352 generic.go:334] "Generic (PLEG): container finished" podID="fc392945-53ad-473c-8803-70e2026712d2" containerID="d4b7300644150fe23cfc59508105971a56a432a4d87f592adbcc874823ecb22d" exitCode=0 Mar 07 21:25:25.223426 master-0 kubenswrapper[16352]: I0307 21:25:25.223356 16352 scope.go:117] "RemoveContainer" containerID="d4b7300644150fe23cfc59508105971a56a432a4d87f592adbcc874823ecb22d" Mar 07 21:25:26.237944 master-0 kubenswrapper[16352]: I0307 21:25:26.237383 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" event={"ID":"fc392945-53ad-473c-8803-70e2026712d2","Type":"ContainerStarted","Data":"e82978d2df1f8a70e0bbed847b9052850d5cc914b32dd53de5991b7b24c41ce5"} Mar 07 21:25:26.239021 master-0 
kubenswrapper[16352]: I0307 21:25:26.238151 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:25:26.241211 master-0 kubenswrapper[16352]: I0307 21:25:26.241161 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-64bf9778cb-q7hrg" Mar 07 21:25:28.577575 master-0 kubenswrapper[16352]: I0307 21:25:28.577383 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:25:28.577575 master-0 kubenswrapper[16352]: I0307 21:25:28.577523 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:25:30.956777 master-0 kubenswrapper[16352]: E0307 21:25:30.956507 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:25:30.956777 master-0 kubenswrapper[16352]: E0307 21:25:30.956750 16352 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 21:25:32.301417 master-0 kubenswrapper[16352]: I0307 21:25:32.300362 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-wp42j_ca25117a-ccd5-4628-8342-e277bb7be0e2/cluster-cloud-controller-manager/0.log" Mar 07 21:25:32.301417 
master-0 kubenswrapper[16352]: I0307 21:25:32.300582 16352 generic.go:334] "Generic (PLEG): container finished" podID="ca25117a-ccd5-4628-8342-e277bb7be0e2" containerID="7b4597e52188f5c573f68426c4f78eaba88e1097110bf160f774be6cf11820b0" exitCode=1 Mar 07 21:25:32.301417 master-0 kubenswrapper[16352]: I0307 21:25:32.300758 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerDied","Data":"7b4597e52188f5c573f68426c4f78eaba88e1097110bf160f774be6cf11820b0"} Mar 07 21:25:32.302324 master-0 kubenswrapper[16352]: I0307 21:25:32.302084 16352 scope.go:117] "RemoveContainer" containerID="7b4597e52188f5c573f68426c4f78eaba88e1097110bf160f774be6cf11820b0" Mar 07 21:25:33.313879 master-0 kubenswrapper[16352]: I0307 21:25:33.313804 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-wp42j_ca25117a-ccd5-4628-8342-e277bb7be0e2/cluster-cloud-controller-manager/0.log" Mar 07 21:25:33.314491 master-0 kubenswrapper[16352]: I0307 21:25:33.313928 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerStarted","Data":"1e6ae5d157f4787fcba9a36f69d742de07ddfaf5066b1f0cd4f66ce51c66c915"} Mar 07 21:25:34.329313 master-0 kubenswrapper[16352]: I0307 21:25:34.329201 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-mc2rc_290f6cf4-daa1-4cae-8e91-2411bf81f8b4/manager/0.log" Mar 07 21:25:34.330804 master-0 kubenswrapper[16352]: I0307 21:25:34.329959 16352 generic.go:334] "Generic (PLEG): container finished" podID="290f6cf4-daa1-4cae-8e91-2411bf81f8b4" 
containerID="229c457c12626388c83b801345b6a2d1fe3bbf16efee0d92665ab237bb56bee9" exitCode=1 Mar 07 21:25:34.330804 master-0 kubenswrapper[16352]: I0307 21:25:34.330071 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" event={"ID":"290f6cf4-daa1-4cae-8e91-2411bf81f8b4","Type":"ContainerDied","Data":"229c457c12626388c83b801345b6a2d1fe3bbf16efee0d92665ab237bb56bee9"} Mar 07 21:25:34.331022 master-0 kubenswrapper[16352]: I0307 21:25:34.330829 16352 scope.go:117] "RemoveContainer" containerID="229c457c12626388c83b801345b6a2d1fe3bbf16efee0d92665ab237bb56bee9" Mar 07 21:25:34.336056 master-0 kubenswrapper[16352]: I0307 21:25:34.333477 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kzjmp_7fa7b789-9201-493e-a96d-484a2622301a/snapshot-controller/0.log" Mar 07 21:25:34.336056 master-0 kubenswrapper[16352]: I0307 21:25:34.333600 16352 generic.go:334] "Generic (PLEG): container finished" podID="7fa7b789-9201-493e-a96d-484a2622301a" containerID="cd505551260b0980137e293c3b0596c534dcce88209069b8f3c0dc90efac996d" exitCode=1 Mar 07 21:25:34.336056 master-0 kubenswrapper[16352]: I0307 21:25:34.333736 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" event={"ID":"7fa7b789-9201-493e-a96d-484a2622301a","Type":"ContainerDied","Data":"cd505551260b0980137e293c3b0596c534dcce88209069b8f3c0dc90efac996d"} Mar 07 21:25:34.336056 master-0 kubenswrapper[16352]: I0307 21:25:34.334922 16352 scope.go:117] "RemoveContainer" containerID="cd505551260b0980137e293c3b0596c534dcce88209069b8f3c0dc90efac996d" Mar 07 21:25:34.338594 master-0 kubenswrapper[16352]: I0307 21:25:34.338411 16352 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-mlxbw_183a5212-1b21-44e4-9ed5-2f63f76e652e/manager/0.log" Mar 07 21:25:34.338594 master-0 kubenswrapper[16352]: I0307 21:25:34.338484 16352 generic.go:334] "Generic (PLEG): container finished" podID="183a5212-1b21-44e4-9ed5-2f63f76e652e" containerID="8f10d93d20499c3da298c974d4861d544ee9c5bce59d8a7447d7dff84ed9c7bb" exitCode=1 Mar 07 21:25:34.338594 master-0 kubenswrapper[16352]: I0307 21:25:34.338528 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" event={"ID":"183a5212-1b21-44e4-9ed5-2f63f76e652e","Type":"ContainerDied","Data":"8f10d93d20499c3da298c974d4861d544ee9c5bce59d8a7447d7dff84ed9c7bb"} Mar 07 21:25:34.339138 master-0 kubenswrapper[16352]: I0307 21:25:34.339096 16352 scope.go:117] "RemoveContainer" containerID="8f10d93d20499c3da298c974d4861d544ee9c5bce59d8a7447d7dff84ed9c7bb" Mar 07 21:25:35.353153 master-0 kubenswrapper[16352]: I0307 21:25:35.353046 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kzjmp_7fa7b789-9201-493e-a96d-484a2622301a/snapshot-controller/0.log" Mar 07 21:25:35.354324 master-0 kubenswrapper[16352]: I0307 21:25:35.353207 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" event={"ID":"7fa7b789-9201-493e-a96d-484a2622301a","Type":"ContainerStarted","Data":"38182d771d7d222c5e351d85d35633bdc17c0d4be667b2ad4ec33c696539a4e8"} Mar 07 21:25:35.358668 master-0 kubenswrapper[16352]: I0307 21:25:35.358617 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-controller_operator-controller-controller-manager-6598bfb6c4-mlxbw_183a5212-1b21-44e4-9ed5-2f63f76e652e/manager/0.log" Mar 07 21:25:35.358963 master-0 kubenswrapper[16352]: I0307 21:25:35.358914 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" event={"ID":"183a5212-1b21-44e4-9ed5-2f63f76e652e","Type":"ContainerStarted","Data":"d0d774398f755e41e2ea8fdcd72a1806f6571d2b1a4e88d31454dc7cb117b33b"} Mar 07 21:25:35.359373 master-0 kubenswrapper[16352]: I0307 21:25:35.359288 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw" Mar 07 21:25:35.363207 master-0 kubenswrapper[16352]: I0307 21:25:35.363161 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-catalogd_catalogd-controller-manager-7f8b8b6f4c-mc2rc_290f6cf4-daa1-4cae-8e91-2411bf81f8b4/manager/0.log" Mar 07 21:25:35.363867 master-0 kubenswrapper[16352]: I0307 21:25:35.363798 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" event={"ID":"290f6cf4-daa1-4cae-8e91-2411bf81f8b4","Type":"ContainerStarted","Data":"de3c4409a91b294718c93d4ceb02af6789da6a1403412ed836e93fcfef7f4594"} Mar 07 21:25:35.364419 master-0 kubenswrapper[16352]: I0307 21:25:35.364363 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc" Mar 07 21:25:35.967663 master-0 kubenswrapper[16352]: E0307 21:25:35.967495 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="6.4s" Mar 07 21:25:38.577163 master-0 kubenswrapper[16352]: I0307 21:25:38.577091 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get 
\"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:25:38.578562 master-0 kubenswrapper[16352]: I0307 21:25:38.578521 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:25:40.291755 master-0 kubenswrapper[16352]: I0307 21:25:40.291647 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-catalogd/catalogd-controller-manager-7f8b8b6f4c-mc2rc"
Mar 07 21:25:42.443041 master-0 kubenswrapper[16352]: I0307 21:25:42.442965 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-wp42j_ca25117a-ccd5-4628-8342-e277bb7be0e2/config-sync-controllers/0.log"
Mar 07 21:25:42.443898 master-0 kubenswrapper[16352]: I0307 21:25:42.443666 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-wp42j_ca25117a-ccd5-4628-8342-e277bb7be0e2/cluster-cloud-controller-manager/0.log"
Mar 07 21:25:42.443898 master-0 kubenswrapper[16352]: I0307 21:25:42.443748 16352 generic.go:334] "Generic (PLEG): container finished" podID="ca25117a-ccd5-4628-8342-e277bb7be0e2" containerID="dd4a42b20d0889c922e1f9c5f727fca2b250feadbb1f3cfd4fdc17a9825b9a9e" exitCode=1
Mar 07 21:25:42.443898 master-0 kubenswrapper[16352]: I0307 21:25:42.443785 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerDied","Data":"dd4a42b20d0889c922e1f9c5f727fca2b250feadbb1f3cfd4fdc17a9825b9a9e"}
Mar 07 21:25:42.444548 master-0 kubenswrapper[16352]: I0307 21:25:42.444510 16352 scope.go:117] "RemoveContainer" containerID="dd4a42b20d0889c922e1f9c5f727fca2b250feadbb1f3cfd4fdc17a9825b9a9e"
Mar 07 21:25:42.953314 master-0 kubenswrapper[16352]: I0307 21:25:42.953135 16352 status_manager.go:851] "Failed to get status for pod" podUID="4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" pod="openshift-kube-apiserver/installer-4-master-0" err="the server was unable to return a response in the time allotted, but may still be processing the request (get pods installer-4-master-0)"
Mar 07 21:25:43.458723 master-0 kubenswrapper[16352]: I0307 21:25:43.458616 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-wp42j_ca25117a-ccd5-4628-8342-e277bb7be0e2/config-sync-controllers/0.log"
Mar 07 21:25:43.459717 master-0 kubenswrapper[16352]: I0307 21:25:43.459619 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cloud-controller-manager-operator_cluster-cloud-controller-manager-operator-7c8df9b496-wp42j_ca25117a-ccd5-4628-8342-e277bb7be0e2/cluster-cloud-controller-manager/0.log"
Mar 07 21:25:43.459831 master-0 kubenswrapper[16352]: I0307 21:25:43.459750 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cloud-controller-manager-operator/cluster-cloud-controller-manager-operator-7c8df9b496-wp42j" event={"ID":"ca25117a-ccd5-4628-8342-e277bb7be0e2","Type":"ContainerStarted","Data":"822ba3688876fdb33bfe3f68178bedf3d0ec3f4be7865beaefc90694f50ada4c"}
Mar 07 21:25:45.398948 master-0 kubenswrapper[16352]: I0307 21:25:45.398837 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-controller/operator-controller-controller-manager-6598bfb6c4-mlxbw"
Mar 07 21:25:48.576927 master-0 kubenswrapper[16352]: I0307 21:25:48.576814 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:25:48.576927 master-0 kubenswrapper[16352]: I0307 21:25:48.576923 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:25:50.679789 master-0 kubenswrapper[16352]: E0307 21:25:50.679550 16352 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - context deadline exceeded" event="&Event{ObjectMeta:{etcd-master-0.189aac162ff13cfa openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-master-0,UID:8e52bef89f4b50e4590a1719bcc5d7e5,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Killing,Message:Stopping container etcd,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:23:34.61730841 +0000 UTC m=+337.688013509,LastTimestamp:2026-03-07 21:23:34.61730841 +0000 UTC m=+337.688013509,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:25:51.233070 master-0 kubenswrapper[16352]: E0307 21:25:51.232876 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:25:41Z\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:25:41Z\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:25:41Z\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-07T21:25:41Z\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3e7365fa46219476560dd59d3a82f041546a33f0935c57eb4f3274ab3118ef0b\\\"],\\\"sizeBytes\\\":2895821940},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:ae042a5d32eb2f18d537f2068849e665b55df7d8360daedaaeea98bd2a79e769\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:d077bbabe6cb885ed229119008480493e8364e4bfddaa00b099f68c52b016e6b\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1733328350},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:82f121f9d021a9843b9458f9f222c40f292f2c21dcfcf00f05daacaca8a949c0\\\"],\\\"sizeBytes\\\":1637445817},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:063b8972231e65eb43f6545ba37804f68138dc54d97b91a652a1c5bc7dc76aa5\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:cf682d23b2857e455609879a0867d171a221c18e2cec995dd79570b77c5a4705\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1272201949},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:381e96959e3c3b08a3e2715e6024697ae14af31bd0378b49f583e984b3b9a192\\\"],\\\"sizeBytes\\\":1238047254},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:e0c034ae18daa01af8d073f8cc24ae4af87883c664304910eab1167fdfd60c0b\\\",\\
\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:ef0c6b9e405f7a452211e063ce07ded04ccbe38b53860bfd71b5a7cd5072830a\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1229556414},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:79984dfbdf9aeae3985c7fd7515e12328775c0e7fc4782929d0998f4dd2a87c6\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:7be89499615ec913d0fe40ca89682080a3f1181a066dbc501c877cc7ccbcc9ae\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1220167376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c9330c756dd6ab107e9a4b671bc52742c90d5be11a8380d8b710e2bd4e0ed43c\\\"],\\\"sizeBytes\\\":992610645},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fca00eb71b1f03e5b5180a66f3871f5626d337b56196622f5842cfc165523b4\\\"],\\\"sizeBytes\\\":943837171},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff40e33e63d6c1f4e4393d5506e38def25ba20582d980fec8b81f81c867ceeec\\\"],\\\"sizeBytes\\\":918278686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:042e6a37747405da54cf91543d44408c9531327a2cce653c41ca851aa7c896d8\\\"],\\\"sizeBytes\\\":880378279},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e207c762b7802ee0e54507d21ed1f25b19eddc511a4b824934c16c163193be6a\\\"],\\\"sizeBytes\\\":876146500},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:41dbd66e9a886c1fd7a99752f358c6125a209e83c0dd37b35730baae58d82ee8\\\"],\\\"sizeBytes\\\":862633255},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2508a5f66e509e813cb09825b5456be91b4cdd4d02f470f22a33de42c753f2b7\\\"],\\\"sizeBytes\\\":862197440},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bfcd8017eede3fb66fa3f5b47c27508b787d38455689154461f0e6a5dc303ff\\\"],\\\"sizeBytes\\\":772939850},{\\\"names\\\":[\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9c946fdc5a4cd16ff998c17844780e7efc38f7f38b97a8a40d75cd77b318ddef\\\"],\\\"sizeBytes\\\":687947017},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:0c03cb25dc6f6a865529ebc979e8d7d08492b28fd3fb93beddf30e1cb06f1245\\\"],\\\"sizeBytes\\\":683169303},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3f34dc492c80a3dee4643cc2291044750ac51e6e919b973de8723fa8b70bde70\\\"],\\\"sizeBytes\\\":677929075},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:db06a0e0308b2e541c7bb2d11517431abb31133b2ce6cb6c34ecf5ef4188a4e8\\\"],\\\"sizeBytes\\\":633876767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a149ed17b20a7577fceacfc5198f8b7b3edf314ee22f77bd6ab87f06a3aa17f3\\\"],\\\"sizeBytes\\\":621647686},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3cdb019b6769514c0e92ef92da73e914fbcf6254cc919677ee077c93ce324de0\\\"],\\\"sizeBytes\\\":605698200},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1ec9d3dbcc6f9817c0f6d09f64c0d98c91b03afbb1fcb3c1e1718aca900754b\\\"],\\\"sizeBytes\\\":589379637},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1575be013a898f153cbf012aeaf28ce720022f934dc05bdffbe479e30999d460\\\"],\\\"sizeBytes\\\":582153879},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:eb82e437a701ce83b70e56be8477d987da67578714dda3d9fa6628804b1b56f5\\\"],\\\"sizeBytes\\\":558210153},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d470dba32064cc62b2ab29303d6e00612304548262eaa2f4e5b40a00a26f71ce\\\"],\\\"sizeBytes\\\":557426734},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:28f33d62fd0b94c5ea0ebcd7a4216848c8dd671a38d901ce98f4c399b700e1c7\\\"],\\\"sizeBytes\\\":548751793},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:cc20748723f55f960cfb6328d1591880bbd1b345
2155633996d4f41fc7c5f46b\\\"],\\\"sizeBytes\\\":529324693},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ac6f0695d3386e6d601f4ae507940981352fa3ad884b0fed6fb25698c5e6f916\\\"],\\\"sizeBytes\\\":528946249},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:6088910bdc1583b275fab261e3234c0b63b4cc16d01bcea697b6a7f6db13bdf3\\\"],\\\"sizeBytes\\\":518384455},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-release@sha256:14bd3c04daa885009785d48f4973e2890751a7ec116cc14d17627245cda54d7b\\\"],\\\"sizeBytes\\\":517997625},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\\\"],\\\"sizeBytes\\\":514980169},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd0b71d620cf0acbfcd1b58797dc30050bd167cb6b7a7f62c8333dd370c76d5\\\"],\\\"sizeBytes\\\":513581866},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9bd818e37e1f9dbe5393c557b89e81010d68171408e0e4157a3d92ae0ca1c953\\\"],\\\"sizeBytes\\\":513220825},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d601c8437b4d8bbe2da0f3b08f1bd8693f5a4ef6d835377ec029c79d9dca5dab\\\"],\\\"sizeBytes\\\":512273539},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ca868abfecbf9a9c414a4c79e57c4c55e62c8a6796f899ba59dde86c4cf4bb\\\"],\\\"sizeBytes\\\":512235767},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1b47d2b146e833bc1612a652136f43afcf1ba30f32cbd0a2f06ca9fc80d969f0\\\"],\\\"sizeBytes\\\":511226810},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:834063dd26fb3d2489e193489198a0d5fbe9c775a0e30173e5fcef6994fbf0f6\\\"],\\\"sizeBytes\\\":511164376},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee46e13e26156c904e5784e2d64511021ed0974a169ccd6476b05bff1c44ec56\\\"],\\\"sizeBytes\\\":508888174},{\\\"names\\\":[\\\"quay.io/openshift-relea
se-dev/ocp-v4.0-art-dev@sha256:7220d16ea511c0f0410cf45db45aaafcc64847c9cb5732ad1eff39ceb482cdba\\\"],\\\"sizeBytes\\\":508544235},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:526c5c02a8fa86a2fa83a7087d4a5c4b1c4072c0f3906163494cc3b3c1295e9b\\\"],\\\"sizeBytes\\\":507967997},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:4010a8f9d932615336227e2fd43325d4fa9025dca4bebe032106efea733fcfc3\\\"],\\\"sizeBytes\\\":506479655},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76b719f5bd541eb1a8bae124d650896b533e7bc3107be536e598b3ab4e135282\\\"],\\\"sizeBytes\\\":506394574},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5de69354d08184ecd6144facc1461777674674e8304971216d4cf1a5025472b9\\\"],\\\"sizeBytes\\\":505344964},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a324f47cf789c0480fa4bcb0812152abc3cd844318bab193108fe4349eed609\\\"],\\\"sizeBytes\\\":505242594},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b8cb5e0caeca0fb02f3e8c72b7ddf1c49e3c602e42e119ba30c60525f1db1821\\\"],\\\"sizeBytes\\\":504658657},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d11f13e867f4df046ca6789bb7273da5d0c08895b3dea00949c8a5458f9e22f9\\\"],\\\"sizeBytes\\\":504623546},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:2c8f904c1084450856b501d40bbc9246265fe34a2b70efec23541e3285da7f88\\\"],\\\"sizeBytes\\\":502712961},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:76bdc35338c4d0f5e5b9448fb73e3578656f908a962286692e12a0372ec721d5\\\"],\\\"sizeBytes\\\":495994161},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ff2db11ce277288befab25ddb86177e832842d2edb5607a2da8f252a030e1cfc\\\"],\\\"sizeBytes\\\":495064829},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9b2e765b795c30c910c331c85226e5db0d56463b6c81d79ded739cba76e2b032\\
\"],\\\"sizeBytes\\\":487151732}]}}\" for node \"master-0\": Patch \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0/status?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:25:52.370098 master-0 kubenswrapper[16352]: E0307 21:25:52.369918 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 07 21:25:53.157321 master-0 kubenswrapper[16352]: E0307 21:25:53.157233 16352 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0"
Mar 07 21:25:53.572582 master-0 kubenswrapper[16352]: I0307 21:25:53.572493 16352 generic.go:334] "Generic (PLEG): container finished" podID="29c709c82970b529e7b9b895aa92ef05" containerID="f9662bba282516d4beebb4d8de068752f0856705f426ad4736bd452078f9a6c3" exitCode=0
Mar 07 21:25:53.573332 master-0 kubenswrapper[16352]: I0307 21:25:53.572595 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerDied","Data":"f9662bba282516d4beebb4d8de068752f0856705f426ad4736bd452078f9a6c3"}
Mar 07 21:25:53.573332 master-0 kubenswrapper[16352]: I0307 21:25:53.573057 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548"
Mar 07 21:25:53.573332 master-0 kubenswrapper[16352]: I0307 21:25:53.573079 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548"
Mar 07 21:25:56.606541 master-0 kubenswrapper[16352]: I0307 21:25:56.606475 16352 generic.go:334] "Generic (PLEG): container finished" podID="2957024f-9646-499f-913c-90b81f01eecd" containerID="d1098107652edaefa736d099e7020ce79c96c3a73438a03277842e63addc39cd" exitCode=0
Mar 07 21:25:56.607754 master-0 kubenswrapper[16352]: I0307 21:25:56.606580 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" event={"ID":"2957024f-9646-499f-913c-90b81f01eecd","Type":"ContainerDied","Data":"d1098107652edaefa736d099e7020ce79c96c3a73438a03277842e63addc39cd"}
Mar 07 21:25:56.608515 master-0 kubenswrapper[16352]: I0307 21:25:56.608495 16352 scope.go:117] "RemoveContainer" containerID="d1098107652edaefa736d099e7020ce79c96c3a73438a03277842e63addc39cd"
Mar 07 21:25:57.621365 master-0 kubenswrapper[16352]: I0307 21:25:57.621263 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6" event={"ID":"2957024f-9646-499f-913c-90b81f01eecd","Type":"ContainerStarted","Data":"94af482fcc2d9b3c471844df80b1404a92106dc3a9a8ce2518ad4d31d740f2b6"}
Mar 07 21:25:57.622452 master-0 kubenswrapper[16352]: I0307 21:25:57.621864 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6"
Mar 07 21:25:57.628350 master-0 kubenswrapper[16352]: I0307 21:25:57.628271 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-68f988879c-j2dj6"
Mar 07 21:25:58.577876 master-0 kubenswrapper[16352]: I0307 21:25:58.577779 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:25:58.578199 master-0 kubenswrapper[16352]: I0307 21:25:58.577892 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:26:01.234428 master-0 kubenswrapper[16352]: E0307 21:26:01.234246 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:26:02.688567 master-0 kubenswrapper[16352]: I0307 21:26:02.688505 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler/0.log"
Mar 07 21:26:02.690809 master-0 kubenswrapper[16352]: I0307 21:26:02.690739 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="75482995cc5f55d9d7fb4b8a57bf5ec36cbaac14083b2719abeb4a1eb62846bc" exitCode=1
Mar 07 21:26:02.690809 master-0 kubenswrapper[16352]: I0307 21:26:02.690824 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerDied","Data":"75482995cc5f55d9d7fb4b8a57bf5ec36cbaac14083b2719abeb4a1eb62846bc"}
Mar 07 21:26:02.691770 master-0 kubenswrapper[16352]: I0307 21:26:02.691725 16352 scope.go:117] "RemoveContainer" containerID="75482995cc5f55d9d7fb4b8a57bf5ec36cbaac14083b2719abeb4a1eb62846bc"
Mar 07 21:26:03.704650 master-0 kubenswrapper[16352]: I0307 21:26:03.704448 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler/0.log"
Mar 07 21:26:03.705620 master-0 kubenswrapper[16352]: I0307 21:26:03.705216 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1d3d45b6ce1b3764f9927e623a71adf8","Type":"ContainerStarted","Data":"49cc11a235efe78997a02668cffbda8c251aec39c02e2f7118030908ead8c408"}
Mar 07 21:26:03.706479 master-0 kubenswrapper[16352]: I0307 21:26:03.706421 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0"
Mar 07 21:26:03.713157 master-0 kubenswrapper[16352]: I0307 21:26:03.713107 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log"
Mar 07 21:26:03.713284 master-0 kubenswrapper[16352]: I0307 21:26:03.713191 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="2c0128d80fea5834fc0f12bf23cdfbeeabbf5c415717881d7c9c6db472d9dd3f" exitCode=0
Mar 07 21:26:03.713284 master-0 kubenswrapper[16352]: I0307 21:26:03.713234 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerDied","Data":"2c0128d80fea5834fc0f12bf23cdfbeeabbf5c415717881d7c9c6db472d9dd3f"}
Mar 07 21:26:03.713977 master-0 kubenswrapper[16352]: I0307 21:26:03.713934 16352 scope.go:117] "RemoveContainer" containerID="2c0128d80fea5834fc0f12bf23cdfbeeabbf5c415717881d7c9c6db472d9dd3f"
Mar 07 21:26:04.728406 master-0 kubenswrapper[16352]: I0307 21:26:04.728224 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log"
Mar 07 21:26:04.729069 master-0 kubenswrapper[16352]: I0307 21:26:04.728555 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"829492ba66d84d478724b4cb80e7d98e492e75ee30e266dab8b875184c6d07e5"}
Mar 07 21:26:04.732563 master-0 kubenswrapper[16352]: I0307 21:26:04.732499 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kzjmp_7fa7b789-9201-493e-a96d-484a2622301a/snapshot-controller/1.log"
Mar 07 21:26:04.733481 master-0 kubenswrapper[16352]: I0307 21:26:04.733429 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kzjmp_7fa7b789-9201-493e-a96d-484a2622301a/snapshot-controller/0.log"
Mar 07 21:26:04.733604 master-0 kubenswrapper[16352]: I0307 21:26:04.733557 16352 generic.go:334] "Generic (PLEG): container finished" podID="7fa7b789-9201-493e-a96d-484a2622301a" containerID="38182d771d7d222c5e351d85d35633bdc17c0d4be667b2ad4ec33c696539a4e8" exitCode=1
Mar 07 21:26:04.733776 master-0 kubenswrapper[16352]: I0307 21:26:04.733715 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" event={"ID":"7fa7b789-9201-493e-a96d-484a2622301a","Type":"ContainerDied","Data":"38182d771d7d222c5e351d85d35633bdc17c0d4be667b2ad4ec33c696539a4e8"}
Mar 07 21:26:04.733897 master-0 kubenswrapper[16352]: I0307 21:26:04.733807 16352 scope.go:117] "RemoveContainer" containerID="cd505551260b0980137e293c3b0596c534dcce88209069b8f3c0dc90efac996d"
Mar 07 21:26:04.734518 master-0 kubenswrapper[16352]: I0307 21:26:04.734477 16352 scope.go:117] "RemoveContainer" containerID="38182d771d7d222c5e351d85d35633bdc17c0d4be667b2ad4ec33c696539a4e8"
Mar 07 21:26:04.734998 master-0 kubenswrapper[16352]: E0307 21:26:04.734934 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"snapshot-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=snapshot-controller pod=csi-snapshot-controller-7577d6f48-kzjmp_openshift-cluster-storage-operator(7fa7b789-9201-493e-a96d-484a2622301a)\"" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" podUID="7fa7b789-9201-493e-a96d-484a2622301a"
Mar 07 21:26:05.744877 master-0 kubenswrapper[16352]: I0307 21:26:05.744795 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kzjmp_7fa7b789-9201-493e-a96d-484a2622301a/snapshot-controller/1.log"
Mar 07 21:26:07.770430 master-0 kubenswrapper[16352]: I0307 21:26:07.770333 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-bbz7l_e3fe386a-dea8-484a-b95a-0f3f475b1f82/machine-approver-controller/0.log"
Mar 07 21:26:07.771318 master-0 kubenswrapper[16352]: I0307 21:26:07.771230 16352 generic.go:334] "Generic (PLEG): container finished" podID="e3fe386a-dea8-484a-b95a-0f3f475b1f82" containerID="e1ff3eae48c3ff1b893f6264aeabf20e527e5a2aada9c5ff0d41a2697e563623" exitCode=255
Mar 07 21:26:07.771398 master-0 kubenswrapper[16352]: I0307 21:26:07.771292 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" event={"ID":"e3fe386a-dea8-484a-b95a-0f3f475b1f82","Type":"ContainerDied","Data":"e1ff3eae48c3ff1b893f6264aeabf20e527e5a2aada9c5ff0d41a2697e563623"}
Mar 07 21:26:07.772244 master-0 kubenswrapper[16352]: I0307 21:26:07.772187 16352 scope.go:117] "RemoveContainer" containerID="e1ff3eae48c3ff1b893f6264aeabf20e527e5a2aada9c5ff0d41a2697e563623"
Mar 07 21:26:07.774792 master-0 kubenswrapper[16352]: I0307 21:26:07.774718 16352 generic.go:334] "Generic (PLEG): container finished" podID="46548c2c-6a8a-4382-87de-2c7a8442a33c" containerID="fd701e4ed1aac8c9685fae0f60e9ef450afe5e5e84030884d9be44f37a388515" exitCode=0
Mar 07 21:26:07.775027 master-0 kubenswrapper[16352]: I0307 21:26:07.774800 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" event={"ID":"46548c2c-6a8a-4382-87de-2c7a8442a33c","Type":"ContainerDied","Data":"fd701e4ed1aac8c9685fae0f60e9ef450afe5e5e84030884d9be44f37a388515"}
Mar 07 21:26:07.776006 master-0 kubenswrapper[16352]: I0307 21:26:07.775939 16352 scope.go:117] "RemoveContainer" containerID="fd701e4ed1aac8c9685fae0f60e9ef450afe5e5e84030884d9be44f37a388515"
Mar 07 21:26:07.778998 master-0 kubenswrapper[16352]: I0307 21:26:07.778911 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-nmwjr_a61a736a-66e5-4ca1-a8a7-088cf73cfcce/cluster-baremetal-operator/0.log"
Mar 07 21:26:07.778998 master-0 kubenswrapper[16352]: I0307 21:26:07.778981 16352 generic.go:334] "Generic (PLEG): container finished" podID="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" containerID="8ba80cee3d89d3d6b976aac0e2f007c4e112b08741be4d9e1220847381797dab" exitCode=1
Mar 07 21:26:07.779159 master-0 kubenswrapper[16352]: I0307 21:26:07.779018 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" event={"ID":"a61a736a-66e5-4ca1-a8a7-088cf73cfcce","Type":"ContainerDied","Data":"8ba80cee3d89d3d6b976aac0e2f007c4e112b08741be4d9e1220847381797dab"}
Mar 07 21:26:07.779617 master-0 kubenswrapper[16352]: I0307 21:26:07.779564 16352 scope.go:117] "RemoveContainer" containerID="8ba80cee3d89d3d6b976aac0e2f007c4e112b08741be4d9e1220847381797dab"
Mar 07 21:26:08.577397 master-0 kubenswrapper[16352]: I0307 21:26:08.577299 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:26:08.577397 master-0 kubenswrapper[16352]: I0307 21:26:08.577367 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:26:08.792755 master-0 kubenswrapper[16352]: I0307 21:26:08.792632 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-machine-approver_machine-approver-754bdc9f9d-bbz7l_e3fe386a-dea8-484a-b95a-0f3f475b1f82/machine-approver-controller/0.log"
Mar 07 21:26:08.794020 master-0 kubenswrapper[16352]: I0307 21:26:08.793386 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-754bdc9f9d-bbz7l" event={"ID":"e3fe386a-dea8-484a-b95a-0f3f475b1f82","Type":"ContainerStarted","Data":"b0d16ef76d22442002a64054c6129e6bced5d233bd3545a603d071f1da09f893"}
Mar 07 21:26:08.797103 master-0 kubenswrapper[16352]: I0307 21:26:08.796992 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-66b55d57d-mc46k" event={"ID":"46548c2c-6a8a-4382-87de-2c7a8442a33c","Type":"ContainerStarted","Data":"e93194c038ab767f072ee692c7ca27410448ec98bba11ff2e800af07d68ab59e"}
Mar 07 21:26:08.800320 master-0 kubenswrapper[16352]: I0307 21:26:08.800268 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-nmwjr_a61a736a-66e5-4ca1-a8a7-088cf73cfcce/cluster-baremetal-operator/0.log"
Mar 07 21:26:08.800459 master-0 kubenswrapper[16352]: I0307 21:26:08.800371 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" event={"ID":"a61a736a-66e5-4ca1-a8a7-088cf73cfcce","Type":"ContainerStarted","Data":"b875c3b239d78b830a0dac3a95e202214895e5565deba372072b56d6eaf81864"}
Mar 07 21:26:09.371968 master-0 kubenswrapper[16352]: E0307 21:26:09.371780 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 07 21:26:11.234979 master-0 kubenswrapper[16352]: E0307 21:26:11.234888 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:26:12.509179 master-0 kubenswrapper[16352]: I0307 21:26:12.509100 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 07 21:26:12.510339 master-0 kubenswrapper[16352]: I0307 21:26:12.510298 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 07 21:26:15.509476 master-0 kubenswrapper[16352]: I0307 21:26:15.509299 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 07 21:26:15.510072 master-0 kubenswrapper[16352]: I0307 21:26:15.509592 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:26:15.902596 master-0 kubenswrapper[16352]: I0307 21:26:15.902535 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dgjgz_1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/control-plane-machine-set-operator/0.log"
Mar 07 21:26:15.903075 master-0 kubenswrapper[16352]: I0307 21:26:15.903033 16352 generic.go:334] "Generic (PLEG): container finished" podID="1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c" containerID="27a84f3840d7bd704fbc6124aef0ebf7f4eef91692b179d35440231e945dc9fe" exitCode=1
Mar 07 21:26:15.903316 master-0 kubenswrapper[16352]: I0307 21:26:15.903125 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" event={"ID":"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c","Type":"ContainerDied","Data":"27a84f3840d7bd704fbc6124aef0ebf7f4eef91692b179d35440231e945dc9fe"}
Mar 07 21:26:15.904237 master-0 kubenswrapper[16352]: I0307 21:26:15.904208 16352 scope.go:117] "RemoveContainer" containerID="27a84f3840d7bd704fbc6124aef0ebf7f4eef91692b179d35440231e945dc9fe"
Mar 07 21:26:15.906440 master-0 kubenswrapper[16352]: I0307 21:26:15.905672 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-kr9ft_e720291b-0f96-4ebb-80f2-5df7cb194ffc/package-server-manager/0.log"
Mar 07 21:26:15.906519 master-0 kubenswrapper[16352]: I0307 21:26:15.906438 16352 generic.go:334] "Generic (PLEG): container finished" podID="e720291b-0f96-4ebb-80f2-5df7cb194ffc" containerID="768e8856043fbb67b776885a9d2a7eceeb5d345ca9e38c33950ec9b98b1495c0" exitCode=1
Mar 07 21:26:15.906586 master-0 kubenswrapper[16352]: I0307 21:26:15.906496 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" event={"ID":"e720291b-0f96-4ebb-80f2-5df7cb194ffc","Type":"ContainerDied","Data":"768e8856043fbb67b776885a9d2a7eceeb5d345ca9e38c33950ec9b98b1495c0"}
Mar 07 21:26:15.907409 master-0 kubenswrapper[16352]: I0307 21:26:15.907352 16352 scope.go:117] "RemoveContainer" containerID="768e8856043fbb67b776885a9d2a7eceeb5d345ca9e38c33950ec9b98b1495c0"
Mar 07 21:26:16.921980 master-0 kubenswrapper[16352]: I0307 21:26:16.921878 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-6686554ddc-dgjgz_1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c/control-plane-machine-set-operator/0.log"
Mar 07 21:26:16.922596 master-0 kubenswrapper[16352]: I0307 21:26:16.922074 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-6686554ddc-dgjgz" event={"ID":"1ba27b7c-a93d-4d6e-a8f2-ec15903dd00c","Type":"ContainerStarted","Data":"ca8b7feb188472478b1af513de3007ae79b841f3131122713d99820f50a1feaf"}
Mar 07 21:26:16.926105 master-0 kubenswrapper[16352]: I0307 21:26:16.926045 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-operator-lifecycle-manager_package-server-manager-854648ff6d-kr9ft_e720291b-0f96-4ebb-80f2-5df7cb194ffc/package-server-manager/0.log"
Mar 07 21:26:16.926807 master-0 kubenswrapper[16352]: I0307 21:26:16.926652 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" event={"ID":"e720291b-0f96-4ebb-80f2-5df7cb194ffc","Type":"ContainerStarted","Data":"c0957ef66d52f4b593e9febd72837b152b673046ef03420e4bf21a72cfb04227"}
Mar 07 21:26:16.927933 master-0 kubenswrapper[16352]: I0307 21:26:16.927317 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft"
Mar 07 21:26:18.577379 master-0 kubenswrapper[16352]: I0307 21:26:18.577249 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:26:18.577379 master-0 kubenswrapper[16352]: I0307 21:26:18.577358 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:26:20.190730 master-0 kubenswrapper[16352]: I0307 21:26:20.190589 16352 scope.go:117] "RemoveContainer" containerID="38182d771d7d222c5e351d85d35633bdc17c0d4be667b2ad4ec33c696539a4e8"
Mar 07 21:26:20.992360 master-0 kubenswrapper[16352]: I0307 21:26:20.992255 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_csi-snapshot-controller-7577d6f48-kzjmp_7fa7b789-9201-493e-a96d-484a2622301a/snapshot-controller/1.log"
Mar 07 21:26:20.992651 master-0 kubenswrapper[16352]: I0307 21:26:20.992370 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-7577d6f48-kzjmp" event={"ID":"7fa7b789-9201-493e-a96d-484a2622301a","Type":"ContainerStarted","Data":"5236288800aaf0ef94e1dec34bbec012de9642d18495b63f94a2ba6ed47595c7"}
Mar 07 21:26:21.235636 master-0 kubenswrapper[16352]: E0307 21:26:21.235513 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:26:24.692882 master-0 kubenswrapper[16352]: E0307 21:26:24.691807 16352 event.go:359] "Server rejected event (will not retry!)" err="Timeout: request did not complete within requested timeout - 
context deadline exceeded" event=< Mar 07 21:26:24.692882 master-0 kubenswrapper[16352]: &Event{ObjectMeta:{console-64d844fb5f-9b28j.189aac1273f6da3c openshift-console 14463 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-console,Name:console-64d844fb5f-9b28j,UID:253bb615-1b60-4112-aee8-f572d1c84114,APIVersion:v1,ResourceVersion:13893,FieldPath:spec.containers{console},},Reason:ProbeError,Message:Startup probe error: Get "https://10.128.0.91:8443/health": dial tcp 10.128.0.91:8443: connect: connection refused Mar 07 21:26:24.692882 master-0 kubenswrapper[16352]: body: Mar 07 21:26:24.692882 master-0 kubenswrapper[16352]: ,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:23:18 +0000 UTC,LastTimestamp:2026-03-07 21:23:38.577482362 +0000 UTC m=+341.648187421,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,} Mar 07 21:26:24.692882 master-0 kubenswrapper[16352]: > Mar 07 21:26:25.508645 master-0 kubenswrapper[16352]: I0307 21:26:25.508519 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 21:26:25.509358 master-0 kubenswrapper[16352]: I0307 21:26:25.508677 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 21:26:26.373769 master-0 
kubenswrapper[16352]: E0307 21:26:26.373629 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 07 21:26:27.577101 master-0 kubenswrapper[16352]: E0307 21:26:27.576974 16352 mirror_client.go:138] "Failed deleting a mirror pod" err="Timeout: request did not complete within requested timeout - context deadline exceeded" pod="openshift-etcd/etcd-master-0" Mar 07 21:26:28.067813 master-0 kubenswrapper[16352]: I0307 21:26:28.067732 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"26fffb0f20c30e50bc8b4d0f4f4d97a3b6bf44eac79454a203ec3d07d43a8c26"} Mar 07 21:26:28.577334 master-0 kubenswrapper[16352]: I0307 21:26:28.577241 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:26:28.592192 master-0 kubenswrapper[16352]: I0307 21:26:28.577325 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:26:29.086008 master-0 kubenswrapper[16352]: I0307 21:26:29.085944 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"2f29bade6a6e847fd3a683721f3d756047c9e8b1613ac6ae420afb53f374a7c2"} Mar 07 21:26:29.086237 master-0 
kubenswrapper[16352]: I0307 21:26:29.086030 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"d1a079280ef9957e2263a060b371c5f088ce400963d19ba7bdfc7ce81f015c24"} Mar 07 21:26:29.086237 master-0 kubenswrapper[16352]: I0307 21:26:29.086046 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"7466c5415315435cd6e3f75f1043b5b073a0dfa2a43c540bc87cc265114e3c60"} Mar 07 21:26:30.108125 master-0 kubenswrapper[16352]: I0307 21:26:30.108020 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-master-0" event={"ID":"29c709c82970b529e7b9b895aa92ef05","Type":"ContainerStarted","Data":"e8b5ddd1757be0cd3df7671b13f6e9a8691ce55c61806947defda8f8ddd23109"} Mar 07 21:26:30.108791 master-0 kubenswrapper[16352]: I0307 21:26:30.108663 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:26:30.108791 master-0 kubenswrapper[16352]: I0307 21:26:30.108742 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-etcd/etcd-master-0" podUID="5a25f683-c9d1-4eee-88bd-2ce05cb77548" Mar 07 21:26:31.237000 master-0 kubenswrapper[16352]: E0307 21:26:31.236932 16352 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"master-0\": Get \"https://api-int.sno.openstack.lab:6443/api/v1/nodes/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:26:31.238156 master-0 kubenswrapper[16352]: E0307 21:26:31.237998 16352 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 07 21:26:33.229193 master-0 kubenswrapper[16352]: I0307 21:26:33.229080 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-etcd/etcd-master-0" Mar 07 21:26:33.229193 master-0 kubenswrapper[16352]: I0307 21:26:33.229155 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-master-0" Mar 07 21:26:34.385084 master-0 kubenswrapper[16352]: I0307 21:26:34.384945 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:49764->127.0.0.1:10357: read: connection reset by peer" start-of-body= Mar 07 21:26:34.386153 master-0 kubenswrapper[16352]: I0307 21:26:34.385104 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:49764->127.0.0.1:10357: read: connection reset by peer" Mar 07 21:26:34.386153 master-0 kubenswrapper[16352]: I0307 21:26:34.385229 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:26:34.387582 master-0 kubenswrapper[16352]: I0307 21:26:34.386757 16352 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"829492ba66d84d478724b4cb80e7d98e492e75ee30e266dab8b875184c6d07e5"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 07 21:26:34.387582 master-0 kubenswrapper[16352]: I0307 21:26:34.386918 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" containerID="cri-o://829492ba66d84d478724b4cb80e7d98e492e75ee30e266dab8b875184c6d07e5" gracePeriod=30 Mar 07 21:26:35.175608 master-0 kubenswrapper[16352]: I0307 21:26:35.175508 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/1.log" Mar 07 21:26:35.179257 master-0 kubenswrapper[16352]: I0307 21:26:35.179207 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:26:35.179514 master-0 kubenswrapper[16352]: I0307 21:26:35.179475 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="829492ba66d84d478724b4cb80e7d98e492e75ee30e266dab8b875184c6d07e5" exitCode=255 Mar 07 21:26:35.179653 master-0 kubenswrapper[16352]: I0307 21:26:35.179601 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerDied","Data":"829492ba66d84d478724b4cb80e7d98e492e75ee30e266dab8b875184c6d07e5"} Mar 07 21:26:35.179821 master-0 kubenswrapper[16352]: I0307 21:26:35.179726 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"3debe4852fde09f4a1fc5bd2b8216ed2c060e15b27dcbf1b53c66c578bc7d637"} Mar 07 21:26:35.179821 master-0 kubenswrapper[16352]: I0307 21:26:35.179769 16352 scope.go:117] "RemoveContainer" containerID="2c0128d80fea5834fc0f12bf23cdfbeeabbf5c415717881d7c9c6db472d9dd3f" Mar 07 21:26:36.194918 master-0 kubenswrapper[16352]: I0307 21:26:36.194821 16352 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/1.log" Mar 07 21:26:36.197854 master-0 kubenswrapper[16352]: I0307 21:26:36.197792 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:26:38.576873 master-0 kubenswrapper[16352]: I0307 21:26:38.576740 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:26:38.577939 master-0 kubenswrapper[16352]: I0307 21:26:38.576869 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:26:41.760881 master-0 kubenswrapper[16352]: I0307 21:26:41.760793 16352 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-etcd/etcd-master-0" Mar 07 21:26:41.841396 master-0 kubenswrapper[16352]: I0307 21:26:41.841310 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 07 21:26:41.846899 master-0 kubenswrapper[16352]: I0307 21:26:41.846803 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 07 21:26:42.508723 master-0 kubenswrapper[16352]: I0307 21:26:42.508616 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:26:42.508723 master-0 kubenswrapper[16352]: I0307 21:26:42.508716 16352 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:26:43.262760 master-0 kubenswrapper[16352]: I0307 21:26:43.262639 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-master-0" Mar 07 21:26:43.375065 master-0 kubenswrapper[16352]: E0307 21:26:43.374951 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 07 21:26:45.509643 master-0 kubenswrapper[16352]: I0307 21:26:45.509521 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 21:26:45.510802 master-0 kubenswrapper[16352]: I0307 21:26:45.509664 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 21:26:47.278413 master-0 kubenswrapper[16352]: I0307 21:26:47.278333 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-854648ff6d-kr9ft" Mar 07 21:26:48.250550 master-0 kubenswrapper[16352]: I0307 21:26:48.250433 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-etcd/etcd-master-0" Mar 07 21:26:48.577133 master-0 kubenswrapper[16352]: I0307 21:26:48.576951 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:26:48.577133 master-0 kubenswrapper[16352]: I0307 21:26:48.577028 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:26:50.916515 master-0 kubenswrapper[16352]: I0307 21:26:50.916403 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:26:54.771155 master-0 kubenswrapper[16352]: E0307 21:26:54.771020 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 07 21:26:55.509333 master-0 kubenswrapper[16352]: I0307 21:26:55.509177 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 21:26:55.509718 master-0 kubenswrapper[16352]: I0307 21:26:55.509336 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get 
\"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 21:26:58.577293 master-0 kubenswrapper[16352]: I0307 21:26:58.577087 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:26:58.577293 master-0 kubenswrapper[16352]: I0307 21:26:58.577192 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:27:00.377752 master-0 kubenswrapper[16352]: E0307 21:27:00.377288 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s" Mar 07 21:27:05.206149 master-0 kubenswrapper[16352]: I0307 21:27:05.204442 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": read tcp 127.0.0.1:48622->127.0.0.1:10357: read: connection reset by peer" start-of-body= Mar 07 21:27:05.206149 master-0 kubenswrapper[16352]: I0307 21:27:05.204637 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": read 
tcp 127.0.0.1:48622->127.0.0.1:10357: read: connection reset by peer" Mar 07 21:27:05.222793 master-0 kubenswrapper[16352]: I0307 21:27:05.219172 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:27:05.222793 master-0 kubenswrapper[16352]: I0307 21:27:05.220477 16352 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"3debe4852fde09f4a1fc5bd2b8216ed2c060e15b27dcbf1b53c66c578bc7d637"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 07 21:27:05.222793 master-0 kubenswrapper[16352]: I0307 21:27:05.220628 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" containerID="cri-o://3debe4852fde09f4a1fc5bd2b8216ed2c060e15b27dcbf1b53c66c578bc7d637" gracePeriod=30 Mar 07 21:27:05.524285 master-0 kubenswrapper[16352]: I0307 21:27:05.524183 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/2.log" Mar 07 21:27:05.524896 master-0 kubenswrapper[16352]: I0307 21:27:05.524801 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/1.log" Mar 07 21:27:05.526832 master-0 kubenswrapper[16352]: I0307 21:27:05.526770 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 
21:27:05.526954 master-0 kubenswrapper[16352]: I0307 21:27:05.526850 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="3debe4852fde09f4a1fc5bd2b8216ed2c060e15b27dcbf1b53c66c578bc7d637" exitCode=255 Mar 07 21:27:05.526954 master-0 kubenswrapper[16352]: I0307 21:27:05.526904 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerDied","Data":"3debe4852fde09f4a1fc5bd2b8216ed2c060e15b27dcbf1b53c66c578bc7d637"} Mar 07 21:27:05.527099 master-0 kubenswrapper[16352]: I0307 21:27:05.526958 16352 scope.go:117] "RemoveContainer" containerID="829492ba66d84d478724b4cb80e7d98e492e75ee30e266dab8b875184c6d07e5" Mar 07 21:27:06.542561 master-0 kubenswrapper[16352]: I0307 21:27:06.542448 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/2.log" Mar 07 21:27:06.546024 master-0 kubenswrapper[16352]: I0307 21:27:06.545947 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:27:06.546182 master-0 kubenswrapper[16352]: I0307 21:27:06.546085 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"} Mar 07 21:27:08.425242 master-0 kubenswrapper[16352]: E0307 21:27:08.425067 16352 kubelet.go:1929] "Failed creating a mirror pod for" err="Internal error occurred: admission plugin \"LimitRanger\" failed to complete mutation in 13s" pod="openshift-etcd/etcd-master-0" Mar 07 21:27:08.572728 
master-0 kubenswrapper[16352]: I0307 21:27:08.572633 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-nmwjr_a61a736a-66e5-4ca1-a8a7-088cf73cfcce/cluster-baremetal-operator/1.log" Mar 07 21:27:08.577385 master-0 kubenswrapper[16352]: I0307 21:27:08.577273 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:27:08.577595 master-0 kubenswrapper[16352]: I0307 21:27:08.577400 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:27:08.582211 master-0 kubenswrapper[16352]: I0307 21:27:08.582149 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-nmwjr_a61a736a-66e5-4ca1-a8a7-088cf73cfcce/cluster-baremetal-operator/0.log" Mar 07 21:27:08.582362 master-0 kubenswrapper[16352]: I0307 21:27:08.582243 16352 generic.go:334] "Generic (PLEG): container finished" podID="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" containerID="b875c3b239d78b830a0dac3a95e202214895e5565deba372072b56d6eaf81864" exitCode=1 Mar 07 21:27:08.582362 master-0 kubenswrapper[16352]: I0307 21:27:08.582304 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" event={"ID":"a61a736a-66e5-4ca1-a8a7-088cf73cfcce","Type":"ContainerDied","Data":"b875c3b239d78b830a0dac3a95e202214895e5565deba372072b56d6eaf81864"} Mar 07 21:27:08.582508 master-0 kubenswrapper[16352]: I0307 21:27:08.582365 16352 scope.go:117] "RemoveContainer" 
containerID="8ba80cee3d89d3d6b976aac0e2f007c4e112b08741be4d9e1220847381797dab" Mar 07 21:27:08.584335 master-0 kubenswrapper[16352]: I0307 21:27:08.583872 16352 scope.go:117] "RemoveContainer" containerID="b875c3b239d78b830a0dac3a95e202214895e5565deba372072b56d6eaf81864" Mar 07 21:27:08.584474 master-0 kubenswrapper[16352]: E0307 21:27:08.584367 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-baremetal-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=cluster-baremetal-operator pod=cluster-baremetal-operator-5cdb4c5598-nmwjr_openshift-machine-api(a61a736a-66e5-4ca1-a8a7-088cf73cfcce)\"" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" podUID="a61a736a-66e5-4ca1-a8a7-088cf73cfcce" Mar 07 21:27:09.590676 master-0 kubenswrapper[16352]: I0307 21:27:09.590565 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-nmwjr_a61a736a-66e5-4ca1-a8a7-088cf73cfcce/cluster-baremetal-operator/1.log" Mar 07 21:27:12.509663 master-0 kubenswrapper[16352]: I0307 21:27:12.509546 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:27:12.511346 master-0 kubenswrapper[16352]: I0307 21:27:12.509768 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:27:15.509948 master-0 kubenswrapper[16352]: I0307 21:27:15.509790 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 21:27:15.511131 master-0 
kubenswrapper[16352]: I0307 21:27:15.509949 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:27:17.380057 master-0 kubenswrapper[16352]: E0307 21:27:17.379933 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 07 21:27:18.577670 master-0 kubenswrapper[16352]: I0307 21:27:18.577550 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:27:18.577670 master-0 kubenswrapper[16352]: I0307 21:27:18.577648 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:27:19.190136 master-0 kubenswrapper[16352]: I0307 21:27:19.190051 16352 scope.go:117] "RemoveContainer" containerID="b875c3b239d78b830a0dac3a95e202214895e5565deba372072b56d6eaf81864"
Mar 07 21:27:19.697594 master-0 kubenswrapper[16352]: I0307 21:27:19.697512 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-baremetal-operator-5cdb4c5598-nmwjr_a61a736a-66e5-4ca1-a8a7-088cf73cfcce/cluster-baremetal-operator/1.log"
Mar 07 21:27:19.699058 master-0 kubenswrapper[16352]: I0307 21:27:19.698984 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-baremetal-operator-5cdb4c5598-nmwjr" event={"ID":"a61a736a-66e5-4ca1-a8a7-088cf73cfcce","Type":"ContainerStarted","Data":"d67c39c033a38e29daa4f2b15ecbf54d5113ffe8a8df9ac7d0580e6c42cb22ef"}
Mar 07 21:27:25.510023 master-0 kubenswrapper[16352]: I0307 21:27:25.509768 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 07 21:27:25.511605 master-0 kubenswrapper[16352]: I0307 21:27:25.510138 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:27:28.576468 master-0 kubenswrapper[16352]: I0307 21:27:28.576368 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:27:28.576468 master-0 kubenswrapper[16352]: I0307 21:27:28.576434 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:27:34.382100 master-0 kubenswrapper[16352]: E0307 21:27:34.381908 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 07 21:27:35.509154 master-0 kubenswrapper[16352]: I0307 21:27:35.509067 16352 patch_prober.go:28] interesting pod/kube-controller-manager-master-0 container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 07 21:27:35.510171 master-0 kubenswrapper[16352]: I0307 21:27:35.509924 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://localhost:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:27:35.510171 master-0 kubenswrapper[16352]: I0307 21:27:35.510077 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 07 21:27:35.511599 master-0 kubenswrapper[16352]: I0307 21:27:35.511534 16352 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"} pod="openshift-kube-controller-manager/kube-controller-manager-master-0" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 07 21:27:35.511852 master-0 kubenswrapper[16352]: I0307 21:27:35.511787 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" containerID="cri-o://e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" gracePeriod=30
Mar 07 21:27:35.648348 master-0 kubenswrapper[16352]: E0307 21:27:35.648254 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(1c6f1e263aa1f0a5ac95d2a74e2c146c)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c"
Mar 07 21:27:35.772262 master-0 kubenswrapper[16352]: E0307 21:27:35.772053 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c6f1e263aa1f0a5ac95d2a74e2c146c.slice/crio-conmon-e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b.scope\": RecentStats: unable to find data in memory cache]"
Mar 07 21:27:35.872849 master-0 kubenswrapper[16352]: I0307 21:27:35.872764 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/3.log"
Mar 07 21:27:35.874122 master-0 kubenswrapper[16352]: I0307 21:27:35.874040 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/2.log"
Mar 07 21:27:35.877160 master-0 kubenswrapper[16352]: I0307 21:27:35.877054 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log"
Mar 07 21:27:35.877319 master-0 kubenswrapper[16352]: I0307 21:27:35.877250 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" exitCode=255
Mar 07 21:27:35.877421 master-0 kubenswrapper[16352]: I0307 21:27:35.877346 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerDied","Data":"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"}
Mar 07 21:27:35.877515 master-0 kubenswrapper[16352]: I0307 21:27:35.877436 16352 scope.go:117] "RemoveContainer" containerID="3debe4852fde09f4a1fc5bd2b8216ed2c060e15b27dcbf1b53c66c578bc7d637"
Mar 07 21:27:35.879949 master-0 kubenswrapper[16352]: I0307 21:27:35.879762 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"
Mar 07 21:27:35.880871 master-0 kubenswrapper[16352]: E0307 21:27:35.880673 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(1c6f1e263aa1f0a5ac95d2a74e2c146c)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c"
Mar 07 21:27:36.894181 master-0 kubenswrapper[16352]: I0307 21:27:36.894099 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/3.log"
Mar 07 21:27:36.899073 master-0 kubenswrapper[16352]: I0307 21:27:36.898996 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log"
Mar 07 21:27:38.577559 master-0 kubenswrapper[16352]: I0307 21:27:38.577465 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:27:38.577559 master-0 kubenswrapper[16352]: I0307 21:27:38.577547 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:27:42.508930 master-0 kubenswrapper[16352]: I0307 21:27:42.508742 16352 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 07 21:27:42.510594 master-0 kubenswrapper[16352]: I0307 21:27:42.510550 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"
Mar 07 21:27:42.511135 master-0 kubenswrapper[16352]: E0307 21:27:42.511104 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(1c6f1e263aa1f0a5ac95d2a74e2c146c)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c"
Mar 07 21:27:48.577296 master-0 kubenswrapper[16352]: I0307 21:27:48.577200 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:27:48.577296 master-0 kubenswrapper[16352]: I0307 21:27:48.577295 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:27:51.384368 master-0 kubenswrapper[16352]: E0307 21:27:51.384243 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" interval="7s"
Mar 07 21:27:52.673189 master-0 kubenswrapper[16352]: I0307 21:27:52.673049 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"]
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: E0307 21:27:52.673553 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96e31400-86e3-46d2-97ee-12fd3e17893a" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673570 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="96e31400-86e3-46d2-97ee-12fd3e17893a" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: E0307 21:27:52.673597 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23d4915d-4b88-4875-b794-414b5b7a1d7b" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673605 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="23d4915d-4b88-4875-b794-414b5b7a1d7b" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: E0307 21:27:52.673636 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673646 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: E0307 21:27:52.673671 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72b4d517-f9c1-4fb2-9217-bd02b6838b07" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673702 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="72b4d517-f9c1-4fb2-9217-bd02b6838b07" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673919 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="23d4915d-4b88-4875-b794-414b5b7a1d7b" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673936 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ffa111b-1cdd-47b9-b015-a7c4ad4c0f5d" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673960 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="96e31400-86e3-46d2-97ee-12fd3e17893a" containerName="installer"
Mar 07 21:27:52.674214 master-0 kubenswrapper[16352]: I0307 21:27:52.673981 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="72b4d517-f9c1-4fb2-9217-bd02b6838b07" containerName="installer"
Mar 07 21:27:52.675402 master-0 kubenswrapper[16352]: I0307 21:27:52.674644 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.678249 master-0 kubenswrapper[16352]: I0307 21:27:52.677326 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-xlnsg"
Mar 07 21:27:52.678507 master-0 kubenswrapper[16352]: I0307 21:27:52.678414 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 07 21:27:52.680064 master-0 kubenswrapper[16352]: I0307 21:27:52.679666 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/installer-5-retry-1-master-0"]
Mar 07 21:27:52.682675 master-0 kubenswrapper[16352]: I0307 21:27:52.681377 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.684546 master-0 kubenswrapper[16352]: I0307 21:27:52.684444 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler"/"installer-sa-dockercfg-8cblb"
Mar 07 21:27:52.684860 master-0 kubenswrapper[16352]: I0307 21:27:52.684774 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler"/"kube-root-ca.crt"
Mar 07 21:27:52.697671 master-0 kubenswrapper[16352]: I0307 21:27:52.697597 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"]
Mar 07 21:27:52.702547 master-0 kubenswrapper[16352]: I0307 21:27:52.702460 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-retry-1-master-0"]
Mar 07 21:27:52.829845 master-0 kubenswrapper[16352]: I0307 21:27:52.829714 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.829845 master-0 kubenswrapper[16352]: I0307 21:27:52.829817 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a69242fc-53d6-48f5-82a9-52daf194d047-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.830451 master-0 kubenswrapper[16352]: I0307 21:27:52.830337 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.830603 master-0 kubenswrapper[16352]: I0307 21:27:52.830553 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.830716 master-0 kubenswrapper[16352]: I0307 21:27:52.830623 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.830951 master-0 kubenswrapper[16352]: I0307 21:27:52.830906 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-var-lock\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.934050 master-0 kubenswrapper[16352]: I0307 21:27:52.933812 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.934050 master-0 kubenswrapper[16352]: I0307 21:27:52.933953 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-kubelet-dir\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.934050 master-0 kubenswrapper[16352]: I0307 21:27:52.933960 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.934473 master-0 kubenswrapper[16352]: I0307 21:27:52.934138 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-var-lock\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.934473 master-0 kubenswrapper[16352]: I0307 21:27:52.934274 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-var-lock\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.934473 master-0 kubenswrapper[16352]: I0307 21:27:52.934319 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.934473 master-0 kubenswrapper[16352]: I0307 21:27:52.934403 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-var-lock\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.934793 master-0 kubenswrapper[16352]: I0307 21:27:52.934485 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a69242fc-53d6-48f5-82a9-52daf194d047-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.937469 master-0 kubenswrapper[16352]: I0307 21:27:52.937366 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.937911 master-0 kubenswrapper[16352]: I0307 21:27:52.937784 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kubelet-dir\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:52.956882 master-0 kubenswrapper[16352]: I0307 21:27:52.956829 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a69242fc-53d6-48f5-82a9-52daf194d047-kube-api-access\") pod \"installer-5-retry-1-master-0\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:52.961343 master-0 kubenswrapper[16352]: I0307 21:27:52.961274 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kube-api-access\") pod \"installer-5-master-0\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:53.056251 master-0 kubenswrapper[16352]: I0307 21:27:53.056150 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0"
Mar 07 21:27:53.094315 master-0 kubenswrapper[16352]: I0307 21:27:53.092968 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-retry-1-master-0"
Mar 07 21:27:53.528977 master-0 kubenswrapper[16352]: I0307 21:27:53.528872 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/installer-5-master-0"]
Mar 07 21:27:53.584069 master-0 kubenswrapper[16352]: I0307 21:27:53.583981 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler/installer-5-retry-1-master-0"]
Mar 07 21:27:53.593794 master-0 kubenswrapper[16352]: W0307 21:27:53.593717 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poda69242fc_53d6_48f5_82a9_52daf194d047.slice/crio-f90f6ca5e0e2250794f5f84ea98270a7ca527bad7f8c21be75568e19d081be09 WatchSource:0}: Error finding container f90f6ca5e0e2250794f5f84ea98270a7ca527bad7f8c21be75568e19d081be09: Status 404 returned error can't find the container with id f90f6ca5e0e2250794f5f84ea98270a7ca527bad7f8c21be75568e19d081be09
Mar 07 21:27:54.083873 master-0 kubenswrapper[16352]: I0307 21:27:54.083599 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"5be2b4e6-94a5-4282-ba6f-86dc7634a28d","Type":"ContainerStarted","Data":"91ae59cbf2b5cf1a678305d658269b272dfb65a219a968058791a1ac204a5668"}
Mar 07 21:27:54.083873 master-0 kubenswrapper[16352]: I0307 21:27:54.083747 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"5be2b4e6-94a5-4282-ba6f-86dc7634a28d","Type":"ContainerStarted","Data":"56dff1fc3fed95aa37852fd9137401ab4752e0ca9f4ffe1ccf3ee9d5239ec6bd"}
Mar 07 21:27:54.087709 master-0 kubenswrapper[16352]: I0307 21:27:54.087622 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-retry-1-master-0" event={"ID":"a69242fc-53d6-48f5-82a9-52daf194d047","Type":"ContainerStarted","Data":"f90f6ca5e0e2250794f5f84ea98270a7ca527bad7f8c21be75568e19d081be09"}
Mar 07 21:27:54.112916 master-0 kubenswrapper[16352]: I0307 21:27:54.112785 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/installer-5-master-0" podStartSLOduration=2.1127563 podStartE2EDuration="2.1127563s" podCreationTimestamp="2026-03-07 21:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:27:54.110301502 +0000 UTC m=+597.181006601" watchObservedRunningTime="2026-03-07 21:27:54.1127563 +0000 UTC m=+597.183461369"
Mar 07 21:27:55.101434 master-0 kubenswrapper[16352]: I0307 21:27:55.101320 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-retry-1-master-0" event={"ID":"a69242fc-53d6-48f5-82a9-52daf194d047","Type":"ContainerStarted","Data":"0e46543b367b8d876d0a1dfbab15863e93b3021665df29db594c1bdd53d219ed"}
Mar 07 21:27:55.137902 master-0 kubenswrapper[16352]: I0307 21:27:55.137755 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/installer-5-retry-1-master-0" podStartSLOduration=3.137722607 podStartE2EDuration="3.137722607s" podCreationTimestamp="2026-03-07 21:27:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:27:55.128403026 +0000 UTC m=+598.199108135" watchObservedRunningTime="2026-03-07 21:27:55.137722607 +0000 UTC m=+598.208427706"
Mar 07 21:27:56.191575 master-0 kubenswrapper[16352]: I0307 21:27:56.191463 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"
Mar 07 21:27:56.192788 master-0 kubenswrapper[16352]: E0307 21:27:56.192164 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(1c6f1e263aa1f0a5ac95d2a74e2c146c)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c"
Mar 07 21:27:58.137846 master-0 kubenswrapper[16352]: I0307 21:27:58.137663 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-dqvvb_b12701eb-4226-4f9c-9398-ad0c3fea7451/cluster-autoscaler-operator/0.log"
Mar 07 21:27:58.139225 master-0 kubenswrapper[16352]: I0307 21:27:58.138917 16352 generic.go:334] "Generic (PLEG): container finished" podID="b12701eb-4226-4f9c-9398-ad0c3fea7451" containerID="d1fc671510809b5ce34fe6d8c109ba8c0532578d622b3287779a089fe73faa48" exitCode=255
Mar 07 21:27:58.139225 master-0 kubenswrapper[16352]: I0307 21:27:58.139041 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" event={"ID":"b12701eb-4226-4f9c-9398-ad0c3fea7451","Type":"ContainerDied","Data":"d1fc671510809b5ce34fe6d8c109ba8c0532578d622b3287779a089fe73faa48"}
Mar 07 21:27:58.140018 master-0 kubenswrapper[16352]: I0307 21:27:58.139906 16352 scope.go:117] "RemoveContainer" containerID="d1fc671510809b5ce34fe6d8c109ba8c0532578d622b3287779a089fe73faa48"
Mar 07 21:27:58.145509 master-0 kubenswrapper[16352]: I0307 21:27:58.145440 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-operator_network-operator-7c649bf6d4-v4xm9_f8980370-267c-4168-ba97-d780698533ff/network-operator/0.log"
Mar 07 21:27:58.145608 master-0 kubenswrapper[16352]: I0307 21:27:58.145551 16352 generic.go:334] "Generic (PLEG): container finished" podID="f8980370-267c-4168-ba97-d780698533ff" containerID="0c0b389df5a30d4ee03cfc1ba37848c4943ddd2770dea8c045d43b6813299002" exitCode=0
Mar 07 21:27:58.145773 master-0 kubenswrapper[16352]: I0307 21:27:58.145625 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" event={"ID":"f8980370-267c-4168-ba97-d780698533ff","Type":"ContainerDied","Data":"0c0b389df5a30d4ee03cfc1ba37848c4943ddd2770dea8c045d43b6813299002"}
Mar 07 21:27:58.145850 master-0 kubenswrapper[16352]: I0307 21:27:58.145775 16352 scope.go:117] "RemoveContainer" containerID="a365b415335d369b3b6313971188bcd1400d9e9f3efd23b32ee5ec456091c9db"
Mar 07 21:27:58.146789 master-0 kubenswrapper[16352]: I0307 21:27:58.146638 16352 scope.go:117] "RemoveContainer" containerID="0c0b389df5a30d4ee03cfc1ba37848c4943ddd2770dea8c045d43b6813299002"
Mar 07 21:27:58.161449 master-0 kubenswrapper[16352]: I0307 21:27:58.158998 16352 generic.go:334] "Generic (PLEG): container finished" podID="2369ce94-237f-41ad-9875-173578764483" containerID="445492e4e6d40332995014dd6be660b4fadf0d896d317c849ff3f3a4ae8887c6" exitCode=0
Mar 07 21:27:58.161449 master-0 kubenswrapper[16352]: I0307 21:27:58.159256 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" event={"ID":"2369ce94-237f-41ad-9875-173578764483","Type":"ContainerDied","Data":"445492e4e6d40332995014dd6be660b4fadf0d896d317c849ff3f3a4ae8887c6"}
Mar 07 21:27:58.161449 master-0 kubenswrapper[16352]: I0307 21:27:58.160592 16352 scope.go:117] "RemoveContainer" containerID="445492e4e6d40332995014dd6be660b4fadf0d896d317c849ff3f3a4ae8887c6"
Mar 07 21:27:58.170232 master-0 kubenswrapper[16352]: I0307 21:27:58.170102 16352 generic.go:334] "Generic (PLEG): container finished" podID="96cfa9d3-fc26-42e9-8bac-ff2c25223654" containerID="a74860c7253b102381265bd05ca71aeed0e3588566e0a6daa693749f3e14d87d" exitCode=0
Mar 07 21:27:58.170397 master-0 kubenswrapper[16352]: I0307 21:27:58.170314 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" event={"ID":"96cfa9d3-fc26-42e9-8bac-ff2c25223654","Type":"ContainerDied","Data":"a74860c7253b102381265bd05ca71aeed0e3588566e0a6daa693749f3e14d87d"}
Mar 07 21:27:58.172169 master-0 kubenswrapper[16352]: I0307 21:27:58.172114 16352 scope.go:117] "RemoveContainer" containerID="a74860c7253b102381265bd05ca71aeed0e3588566e0a6daa693749f3e14d87d"
Mar 07 21:27:58.186303 master-0 kubenswrapper[16352]: I0307 21:27:58.186217 16352 generic.go:334] "Generic (PLEG): container finished" podID="8269652e-360f-43ef-9e7d-473c5f478275" containerID="bb011da2147b400e02dca81678c2674f7f4945ae82c7c12a0ca2e2e7f531abc9" exitCode=0
Mar 07 21:27:58.186468 master-0 kubenswrapper[16352]: I0307 21:27:58.186333 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerDied","Data":"bb011da2147b400e02dca81678c2674f7f4945ae82c7c12a0ca2e2e7f531abc9"}
Mar 07 21:27:58.187357 master-0 kubenswrapper[16352]: I0307 21:27:58.187303 16352 scope.go:117] "RemoveContainer" containerID="bb011da2147b400e02dca81678c2674f7f4945ae82c7c12a0ca2e2e7f531abc9"
Mar 07 21:27:58.194120 master-0 kubenswrapper[16352]: I0307 21:27:58.194046 16352 generic.go:334] "Generic (PLEG): container finished" podID="ff7c5ff2-49d2-4a55-96d1-5244ae8ad602" containerID="c83498128763a2f148ac39982dea44c5fce21b488aae118bfb334b72079782c3" exitCode=0
Mar 07 21:27:58.194348 master-0 kubenswrapper[16352]: I0307 21:27:58.194130 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" event={"ID":"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602","Type":"ContainerDied","Data":"c83498128763a2f148ac39982dea44c5fce21b488aae118bfb334b72079782c3"}
Mar 07 21:27:58.194651 master-0 kubenswrapper[16352]: I0307 21:27:58.194560 16352 scope.go:117] "RemoveContainer" containerID="c83498128763a2f148ac39982dea44c5fce21b488aae118bfb334b72079782c3"
Mar 07 21:27:58.197146 master-0 kubenswrapper[16352]: I0307 21:27:58.197080 16352 generic.go:334] "Generic (PLEG): container finished" podID="dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2" containerID="3d9119a026d90b8ec2d78d2795489aa4c35f51d54ccaf8a6982c9cbfecf34cd0" exitCode=0
Mar 07 21:27:58.197146 master-0 kubenswrapper[16352]: I0307 21:27:58.197142 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" event={"ID":"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2","Type":"ContainerDied","Data":"3d9119a026d90b8ec2d78d2795489aa4c35f51d54ccaf8a6982c9cbfecf34cd0"}
Mar 07 21:27:58.197511 master-0 kubenswrapper[16352]: I0307 21:27:58.197461 16352 scope.go:117] "RemoveContainer" containerID="3d9119a026d90b8ec2d78d2795489aa4c35f51d54ccaf8a6982c9cbfecf34cd0"
Mar 07 21:27:58.201328 master-0 kubenswrapper[16352]: I0307 21:27:58.201203 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-t8jw4_46d1b044-16fb-4442-a554-6b15a8a1c8ae/machine-api-operator/0.log"
Mar 07 21:27:58.201997 master-0 kubenswrapper[16352]: I0307 21:27:58.201886 16352 generic.go:334] "Generic (PLEG): container finished" podID="46d1b044-16fb-4442-a554-6b15a8a1c8ae" containerID="22bb9c50e586557c26e348d932ac5dad20b01bd083cb9c200964357361e20692" exitCode=255
Mar 07 21:27:58.201997 master-0 kubenswrapper[16352]: I0307 21:27:58.201942 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" event={"ID":"46d1b044-16fb-4442-a554-6b15a8a1c8ae","Type":"ContainerDied","Data":"22bb9c50e586557c26e348d932ac5dad20b01bd083cb9c200964357361e20692"}
Mar 07 21:27:58.203249 master-0 kubenswrapper[16352]: I0307 21:27:58.203169 16352 scope.go:117] "RemoveContainer" containerID="22bb9c50e586557c26e348d932ac5dad20b01bd083cb9c200964357361e20692"
Mar 07 21:27:58.207954 master-0 kubenswrapper[16352]: I0307 21:27:58.207865 16352 generic.go:334] "Generic (PLEG): container finished" podID="abfb5602-7255-43d7-a510-e7f94885887e" containerID="39007789be66eb488faa54345a100705571cfad0f002f23e0dcd219cdce1ebd3" exitCode=0
Mar 07 21:27:58.208255 master-0 kubenswrapper[16352]: I0307 21:27:58.208013 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" event={"ID":"abfb5602-7255-43d7-a510-e7f94885887e","Type":"ContainerDied","Data":"39007789be66eb488faa54345a100705571cfad0f002f23e0dcd219cdce1ebd3"}
Mar 07 21:27:58.209082 master-0 kubenswrapper[16352]: I0307 21:27:58.209031 16352 scope.go:117] "RemoveContainer" containerID="39007789be66eb488faa54345a100705571cfad0f002f23e0dcd219cdce1ebd3"
Mar 07 21:27:58.219327 master-0 kubenswrapper[16352]: I0307 21:27:58.219257 16352 generic.go:334] "Generic (PLEG): container finished" podID="bd9cf577-3c49-417b-a6c0-9d307c113221" containerID="8fa812b3126769f1b859d734a7a96fc03f149ac91f0eb8368e542c55f6e18fc4" exitCode=0
Mar 07 21:27:58.219510 master-0 kubenswrapper[16352]: I0307 21:27:58.219392 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" event={"ID":"bd9cf577-3c49-417b-a6c0-9d307c113221","Type":"ContainerDied","Data":"8fa812b3126769f1b859d734a7a96fc03f149ac91f0eb8368e542c55f6e18fc4"}
Mar 07 21:27:58.220357 master-0 kubenswrapper[16352]: I0307 21:27:58.220311 16352 scope.go:117] "RemoveContainer" containerID="8fa812b3126769f1b859d734a7a96fc03f149ac91f0eb8368e542c55f6e18fc4"
Mar 07 21:27:58.227545 master-0 kubenswrapper[16352]: I0307 21:27:58.227468 16352 generic.go:334] "Generic (PLEG): container finished" podID="5b339e6a-cae6-416a-963b-2fd23cecba96" containerID="462d721f7750425af90d3f273635e726bcc5aa1beb2ca22700d6eca8c4a03024" exitCode=0
Mar 07 21:27:58.227700 master-0 kubenswrapper[16352]: I0307
21:27:58.227584 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" event={"ID":"5b339e6a-cae6-416a-963b-2fd23cecba96","Type":"ContainerDied","Data":"462d721f7750425af90d3f273635e726bcc5aa1beb2ca22700d6eca8c4a03024"} Mar 07 21:27:58.229103 master-0 kubenswrapper[16352]: I0307 21:27:58.228984 16352 scope.go:117] "RemoveContainer" containerID="462d721f7750425af90d3f273635e726bcc5aa1beb2ca22700d6eca8c4a03024" Mar 07 21:27:58.242798 master-0 kubenswrapper[16352]: I0307 21:27:58.242735 16352 generic.go:334] "Generic (PLEG): container finished" podID="5446df8b-23d4-4bf3-84ac-d8e1d18813af" containerID="404e2d81c63c2fba1b320a0b186d8fcfd12c559ca83bb773f1ba38fc1d224277" exitCode=0 Mar 07 21:27:58.242960 master-0 kubenswrapper[16352]: I0307 21:27:58.242815 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" event={"ID":"5446df8b-23d4-4bf3-84ac-d8e1d18813af","Type":"ContainerDied","Data":"404e2d81c63c2fba1b320a0b186d8fcfd12c559ca83bb773f1ba38fc1d224277"} Mar 07 21:27:58.243585 master-0 kubenswrapper[16352]: I0307 21:27:58.243500 16352 scope.go:117] "RemoveContainer" containerID="404e2d81c63c2fba1b320a0b186d8fcfd12c559ca83bb773f1ba38fc1d224277" Mar 07 21:27:58.246888 master-0 kubenswrapper[16352]: I0307 21:27:58.246671 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-sxqnh_f8c93e0d-54e5-4c80-9d69-a70317baeacf/cluster-node-tuning-operator/0.log" Mar 07 21:27:58.247085 master-0 kubenswrapper[16352]: I0307 21:27:58.246894 16352 generic.go:334] "Generic (PLEG): container finished" podID="f8c93e0d-54e5-4c80-9d69-a70317baeacf" containerID="ce277045c24d296bd4d74241d9987bda75f36597494e88ec64b1272f784cd2cf" exitCode=1 Mar 07 21:27:58.247085 master-0 kubenswrapper[16352]: I0307 21:27:58.246950 16352 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" event={"ID":"f8c93e0d-54e5-4c80-9d69-a70317baeacf","Type":"ContainerDied","Data":"ce277045c24d296bd4d74241d9987bda75f36597494e88ec64b1272f784cd2cf"} Mar 07 21:27:58.247477 master-0 kubenswrapper[16352]: I0307 21:27:58.247395 16352 scope.go:117] "RemoveContainer" containerID="ce277045c24d296bd4d74241d9987bda75f36597494e88ec64b1272f784cd2cf" Mar 07 21:27:58.259758 master-0 kubenswrapper[16352]: I0307 21:27:58.259641 16352 generic.go:334] "Generic (PLEG): container finished" podID="24f69689-ff12-4786-af05-61429e9eadf8" containerID="f61c9664ad5014f7591f08646987ba716f66b5b9dca224d83b995060556f0add" exitCode=0 Mar 07 21:27:58.259887 master-0 kubenswrapper[16352]: I0307 21:27:58.259830 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" event={"ID":"24f69689-ff12-4786-af05-61429e9eadf8","Type":"ContainerDied","Data":"f61c9664ad5014f7591f08646987ba716f66b5b9dca224d83b995060556f0add"} Mar 07 21:27:58.260791 master-0 kubenswrapper[16352]: I0307 21:27:58.260676 16352 scope.go:117] "RemoveContainer" containerID="f61c9664ad5014f7591f08646987ba716f66b5b9dca224d83b995060556f0add" Mar 07 21:27:58.279873 master-0 kubenswrapper[16352]: I0307 21:27:58.277244 16352 generic.go:334] "Generic (PLEG): container finished" podID="ab2f6566-730d-46f5-92ed-79e3039d24e8" containerID="6cccee54a91d1198afbca96aa8060f7dbbf6cd82c693fb0dbe258a47b31e07b2" exitCode=0 Mar 07 21:27:58.279873 master-0 kubenswrapper[16352]: I0307 21:27:58.277290 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" event={"ID":"ab2f6566-730d-46f5-92ed-79e3039d24e8","Type":"ContainerDied","Data":"6cccee54a91d1198afbca96aa8060f7dbbf6cd82c693fb0dbe258a47b31e07b2"} Mar 07 21:27:58.279873 master-0 kubenswrapper[16352]: I0307 21:27:58.278326 16352 
scope.go:117] "RemoveContainer" containerID="6cccee54a91d1198afbca96aa8060f7dbbf6cd82c693fb0dbe258a47b31e07b2" Mar 07 21:27:58.332770 master-0 kubenswrapper[16352]: I0307 21:27:58.332651 16352 scope.go:117] "RemoveContainer" containerID="e4c20cfb39db1342bdb31f41fc9c1caf9efa43065ea9e9334f061db96ddead54" Mar 07 21:27:58.556994 master-0 kubenswrapper[16352]: I0307 21:27:58.556849 16352 scope.go:117] "RemoveContainer" containerID="ee323378e5f254b4936ebddaed79c44e072c4abc42a4ea5e2f28f2991df5cf33" Mar 07 21:27:58.577320 master-0 kubenswrapper[16352]: I0307 21:27:58.577256 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:27:58.578084 master-0 kubenswrapper[16352]: I0307 21:27:58.577923 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:27:58.638579 master-0 kubenswrapper[16352]: I0307 21:27:58.637968 16352 scope.go:117] "RemoveContainer" containerID="98e7e40d5b40416680e1b256712d9b6487df5695b6f01c16e2334511df19f429" Mar 07 21:27:58.679881 master-0 kubenswrapper[16352]: I0307 21:27:58.679837 16352 scope.go:117] "RemoveContainer" containerID="4ce1bc8e249944d7cde9f138282cf087a8521cf190e44f0f1b32f20172ea8a91" Mar 07 21:27:58.725171 master-0 kubenswrapper[16352]: I0307 21:27:58.725124 16352 scope.go:117] "RemoveContainer" containerID="c541936d2c1e33ad24f13bb7de438be39b6542e54689f0c9212561c0b1fef232" Mar 07 21:27:59.287284 master-0 kubenswrapper[16352]: I0307 21:27:59.287123 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-69b6fc6b88-cg9rz" 
event={"ID":"24f69689-ff12-4786-af05-61429e9eadf8","Type":"ContainerStarted","Data":"4e491353d25b2c5150c1afba58d3f0d3f3e4b0c6a7f1603d4505c25224fbb5de"} Mar 07 21:27:59.289366 master-0 kubenswrapper[16352]: I0307 21:27:59.289317 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-ff46b7bdf-55p6v" event={"ID":"5446df8b-23d4-4bf3-84ac-d8e1d18813af","Type":"ContainerStarted","Data":"13adf37be912179509ee9160b4b66e8de46c9d8ba4a47b3ff1ece5961490178a"} Mar 07 21:27:59.293575 master-0 kubenswrapper[16352]: I0307 21:27:59.293527 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-olm-operator/cluster-olm-operator-77899cf6d-cgdkk" event={"ID":"8269652e-360f-43ef-9e7d-473c5f478275","Type":"ContainerStarted","Data":"d727d42e330bb975ad189e3b539db6a75b0f30a1ed331a9b2f4eae2b83e88f60"} Mar 07 21:27:59.299927 master-0 kubenswrapper[16352]: I0307 21:27:59.299869 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-68bd585b-qnhrz" event={"ID":"5b339e6a-cae6-416a-963b-2fd23cecba96","Type":"ContainerStarted","Data":"086f07d9d67c0f38e863822bc928f3acb919cc18f1ac8f857520eea31d9c6be6"} Mar 07 21:27:59.304027 master-0 kubenswrapper[16352]: I0307 21:27:59.303993 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_cluster-autoscaler-operator-69576476f7-dqvvb_b12701eb-4226-4f9c-9398-ad0c3fea7451/cluster-autoscaler-operator/0.log" Mar 07 21:27:59.304643 master-0 kubenswrapper[16352]: I0307 21:27:59.304610 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/cluster-autoscaler-operator-69576476f7-dqvvb" event={"ID":"b12701eb-4226-4f9c-9398-ad0c3fea7451","Type":"ContainerStarted","Data":"1d31da760bd01911c7e47cd468a7e88616b02b3060df5e71d869aa1784ae154b"} Mar 07 21:27:59.306981 master-0 kubenswrapper[16352]: I0307 21:27:59.306936 16352 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-cluster-version/cluster-version-operator-8c9c967c7-s44f4" event={"ID":"96cfa9d3-fc26-42e9-8bac-ff2c25223654","Type":"ContainerStarted","Data":"90e5d77c07f9c7e796be977e92e90e18d8c967a25ac80d1e76dc40addac44096"} Mar 07 21:27:59.312890 master-0 kubenswrapper[16352]: I0307 21:27:59.312860 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/csi-snapshot-controller-operator-5685fbc7d-txnh5" event={"ID":"ab2f6566-730d-46f5-92ed-79e3039d24e8","Type":"ContainerStarted","Data":"d0f48e3d7059bb0a52600a204b915e1246f623fcc04f4b1ba8f6fe638a1f5435"} Mar 07 21:27:59.342708 master-0 kubenswrapper[16352]: I0307 21:27:59.338639 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-node-tuning-operator_cluster-node-tuning-operator-66c7586884-sxqnh_f8c93e0d-54e5-4c80-9d69-a70317baeacf/cluster-node-tuning-operator/0.log" Mar 07 21:27:59.342708 master-0 kubenswrapper[16352]: I0307 21:27:59.338836 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/cluster-node-tuning-operator-66c7586884-sxqnh" event={"ID":"f8c93e0d-54e5-4c80-9d69-a70317baeacf","Type":"ContainerStarted","Data":"4a4a75ef16ffdd6a40d9bb7435626ab1f476fd80d80a250d807d32f1bb519d61"} Mar 07 21:27:59.351566 master-0 kubenswrapper[16352]: I0307 21:27:59.350649 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-86d7cdfdfb-wb26b" event={"ID":"abfb5602-7255-43d7-a510-e7f94885887e","Type":"ContainerStarted","Data":"e64a55dc031663cc4ff3aaee55500228d7c3ebf37265761d11593474ad58186c"} Mar 07 21:27:59.360694 master-0 kubenswrapper[16352]: I0307 21:27:59.359416 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/cluster-storage-operator-6fbfc8dc8f-v48jn" 
event={"ID":"bd9cf577-3c49-417b-a6c0-9d307c113221","Type":"ContainerStarted","Data":"980a3113e50fac263705d7d89fb2b05255e08dc3574c1e3479ab5ab62f47aa6d"} Mar 07 21:27:59.364696 master-0 kubenswrapper[16352]: I0307 21:27:59.363902 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-84bfdbbb7f-h76wh" event={"ID":"2369ce94-237f-41ad-9875-173578764483","Type":"ContainerStarted","Data":"324aa5c5f93d19073981d865f1cff5010035a97389b80121d7e08b52e6f83d6e"} Mar 07 21:27:59.381712 master-0 kubenswrapper[16352]: I0307 21:27:59.372662 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-84bf6db4f9-t8jw4_46d1b044-16fb-4442-a554-6b15a8a1c8ae/machine-api-operator/0.log" Mar 07 21:27:59.381712 master-0 kubenswrapper[16352]: I0307 21:27:59.373355 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-84bf6db4f9-t8jw4" event={"ID":"46d1b044-16fb-4442-a554-6b15a8a1c8ae","Type":"ContainerStarted","Data":"2f456f138af73d5ce014356b59076eb7edb229bbcc2153090b736e600b468189"} Mar 07 21:27:59.381712 master-0 kubenswrapper[16352]: I0307 21:27:59.378213 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-7c6989d6c4-7w8wf" event={"ID":"ff7c5ff2-49d2-4a55-96d1-5244ae8ad602","Type":"ContainerStarted","Data":"19a6147f35aed12304ff555f5d2943c4800e41d555427fdf61830333790009fc"} Mar 07 21:27:59.381712 master-0 kubenswrapper[16352]: I0307 21:27:59.381279 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-86d6d77c7c-kg26q" event={"ID":"dfcd12b2-b4e2-4d89-b75e-f7761f4e41d2","Type":"ContainerStarted","Data":"34194f59d2fb4893369298dd15f9dddf2c8b9ab0cbbfe8ec609641aee9a50fd2"} Mar 07 21:27:59.390698 master-0 kubenswrapper[16352]: I0307 21:27:59.389430 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-operator/network-operator-7c649bf6d4-v4xm9" event={"ID":"f8980370-267c-4168-ba97-d780698533ff","Type":"ContainerStarted","Data":"82a7a7c8aa37f8d8ed4e0ead8b2b0c17e648ffadef6f5fe33a3a3da4e8a8d8c6"} Mar 07 21:28:08.577035 master-0 kubenswrapper[16352]: I0307 21:28:08.576926 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:28:08.577035 master-0 kubenswrapper[16352]: I0307 21:28:08.577028 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:28:08.578266 master-0 kubenswrapper[16352]: I0307 21:28:08.577097 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:28:08.578266 master-0 kubenswrapper[16352]: I0307 21:28:08.578042 16352 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="console" containerStatusID={"Type":"cri-o","ID":"79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371"} pod="openshift-console/console-64d844fb5f-9b28j" containerMessage="Container console failed startup probe, will be restarted" Mar 07 21:28:09.181103 master-0 kubenswrapper[16352]: E0307 21:28:09.181015 16352 handlers.go:78] "Exec lifecycle hook for Container in Pod failed" err="command 'sleep 25' exited with 137: " execCommand=["sleep","25"] containerName="console" pod="openshift-console/console-64d844fb5f-9b28j" message="" Mar 07 21:28:09.181381 master-0 kubenswrapper[16352]: E0307 21:28:09.181331 16352 kuberuntime_container.go:691] "PreStop hook failed" err="command 'sleep 25' 
exited with 137: " pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" containerID="cri-o://79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371" Mar 07 21:28:09.181480 master-0 kubenswrapper[16352]: I0307 21:28:09.181412 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" containerID="cri-o://79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371" gracePeriod=40 Mar 07 21:28:09.512858 master-0 kubenswrapper[16352]: I0307 21:28:09.512761 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d844fb5f-9b28j_253bb615-1b60-4112-aee8-f572d1c84114/console/0.log" Mar 07 21:28:09.513155 master-0 kubenswrapper[16352]: I0307 21:28:09.512878 16352 generic.go:334] "Generic (PLEG): container finished" podID="253bb615-1b60-4112-aee8-f572d1c84114" containerID="79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371" exitCode=255 Mar 07 21:28:09.513155 master-0 kubenswrapper[16352]: I0307 21:28:09.512944 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d844fb5f-9b28j" event={"ID":"253bb615-1b60-4112-aee8-f572d1c84114","Type":"ContainerDied","Data":"79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371"} Mar 07 21:28:09.513155 master-0 kubenswrapper[16352]: I0307 21:28:09.513013 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d844fb5f-9b28j" event={"ID":"253bb615-1b60-4112-aee8-f572d1c84114","Type":"ContainerStarted","Data":"e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699"} Mar 07 21:28:11.189386 master-0 kubenswrapper[16352]: I0307 21:28:11.189297 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:11.190178 master-0 
kubenswrapper[16352]: E0307 21:28:11.189583 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cluster-policy-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=cluster-policy-controller pod=kube-controller-manager-master-0_openshift-kube-controller-manager(1c6f1e263aa1f0a5ac95d2a74e2c146c)\"" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" Mar 07 21:28:16.051253 master-0 kubenswrapper[16352]: I0307 21:28:16.051174 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:28:16.052494 master-0 kubenswrapper[16352]: I0307 21:28:16.052447 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="prometheus" containerID="cri-o://80d8dc37b0ac916e93ffc8cf045352f9940999409a2ec6b57f769d6ce37829e8" gracePeriod=600 Mar 07 21:28:16.052747 master-0 kubenswrapper[16352]: I0307 21:28:16.052620 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="config-reloader" containerID="cri-o://a8467b33ca9f402e1c71e8d3f6e98a801a76f328df657df0db479d82b62f50e3" gracePeriod=600 Mar 07 21:28:16.052835 master-0 kubenswrapper[16352]: I0307 21:28:16.052490 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy" containerID="cri-o://2838178b4f1ff502863f7dcc3d2546a40d8338e52a2be0345ddd81cb81b6cfa5" gracePeriod=600 Mar 07 21:28:16.052894 master-0 kubenswrapper[16352]: I0307 21:28:16.052791 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" 
podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="thanos-sidecar" containerID="cri-o://bf67172d49ecbb34d6bd13a173d3bb7089a0a5c3c57b4c66dd6aa4c8fd2f11fc" gracePeriod=600 Mar 07 21:28:16.052945 master-0 kubenswrapper[16352]: I0307 21:28:16.052639 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-web" containerID="cri-o://ccdc3057f0c5390145addb4536e286a11c76db79587c90eacf6c9d0ef0c38b2e" gracePeriod=600 Mar 07 21:28:16.052989 master-0 kubenswrapper[16352]: I0307 21:28:16.052552 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-thanos" containerID="cri-o://5e2780e2e7c98c2c748c388c7ed2efb0b206c5e5cf8c84e599be58fe861624bd" gracePeriod=600 Mar 07 21:28:16.066590 master-0 kubenswrapper[16352]: I0307 21:28:16.065273 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 07 21:28:16.066590 master-0 kubenswrapper[16352]: I0307 21:28:16.065800 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="alertmanager" containerID="cri-o://8990d92d7314bd8c6c9472dce3afd8d9a5d4579a2e23086a7bfdf4b6e779d5de" gracePeriod=120 Mar 07 21:28:16.066590 master-0 kubenswrapper[16352]: I0307 21:28:16.065979 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="prom-label-proxy" containerID="cri-o://df0305c3b23c7fc5354180436d83231087c4a341c296678d6e5b76e30d0f72c4" gracePeriod=120 Mar 07 21:28:16.066590 master-0 kubenswrapper[16352]: I0307 21:28:16.066055 16352 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-metric" containerID="cri-o://e4adb18ca8c14077d35457f3a2cf7fcda5afb5906a69465cfa0e4c206ff04578" gracePeriod=120 Mar 07 21:28:16.066590 master-0 kubenswrapper[16352]: I0307 21:28:16.066094 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy" containerID="cri-o://7d4a3171cf827d2bd252bd67d3527faaeb48ec4ad32b82c909c0de55f87057fa" gracePeriod=120 Mar 07 21:28:16.066590 master-0 kubenswrapper[16352]: I0307 21:28:16.066148 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-web" containerID="cri-o://8b7b3e87871f954bd434780d4486a3763a32e7e97d8bda5a8f30d82a22dc54fa" gracePeriod=120 Mar 07 21:28:16.066590 master-0 kubenswrapper[16352]: I0307 21:28:16.066186 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="config-reloader" containerID="cri-o://a53a9a1071ac13a88d1951a193326d32b83a1c1780abad0e0dafbb3804cc8bca" gracePeriod=120 Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621224 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerID="5e2780e2e7c98c2c748c388c7ed2efb0b206c5e5cf8c84e599be58fe861624bd" exitCode=0 Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621283 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerID="2838178b4f1ff502863f7dcc3d2546a40d8338e52a2be0345ddd81cb81b6cfa5" exitCode=0 Mar 07 21:28:16.621815 master-0 
kubenswrapper[16352]: I0307 21:28:16.621292 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerID="ccdc3057f0c5390145addb4536e286a11c76db79587c90eacf6c9d0ef0c38b2e" exitCode=0 Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621301 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerID="bf67172d49ecbb34d6bd13a173d3bb7089a0a5c3c57b4c66dd6aa4c8fd2f11fc" exitCode=0 Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621310 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerID="a8467b33ca9f402e1c71e8d3f6e98a801a76f328df657df0db479d82b62f50e3" exitCode=0 Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621318 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerID="80d8dc37b0ac916e93ffc8cf045352f9940999409a2ec6b57f769d6ce37829e8" exitCode=0 Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621385 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"5e2780e2e7c98c2c748c388c7ed2efb0b206c5e5cf8c84e599be58fe861624bd"} Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621422 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"2838178b4f1ff502863f7dcc3d2546a40d8338e52a2be0345ddd81cb81b6cfa5"} Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621433 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"ccdc3057f0c5390145addb4536e286a11c76db79587c90eacf6c9d0ef0c38b2e"} Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621442 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"bf67172d49ecbb34d6bd13a173d3bb7089a0a5c3c57b4c66dd6aa4c8fd2f11fc"} Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621452 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"a8467b33ca9f402e1c71e8d3f6e98a801a76f328df657df0db479d82b62f50e3"} Mar 07 21:28:16.621815 master-0 kubenswrapper[16352]: I0307 21:28:16.621461 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"80d8dc37b0ac916e93ffc8cf045352f9940999409a2ec6b57f769d6ce37829e8"} Mar 07 21:28:16.629917 master-0 kubenswrapper[16352]: I0307 21:28:16.629871 16352 generic.go:334] "Generic (PLEG): container finished" podID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerID="df0305c3b23c7fc5354180436d83231087c4a341c296678d6e5b76e30d0f72c4" exitCode=0 Mar 07 21:28:16.631061 master-0 kubenswrapper[16352]: I0307 21:28:16.629906 16352 generic.go:334] "Generic (PLEG): container finished" podID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerID="e4adb18ca8c14077d35457f3a2cf7fcda5afb5906a69465cfa0e4c206ff04578" exitCode=0 Mar 07 21:28:16.631061 master-0 kubenswrapper[16352]: I0307 21:28:16.631037 16352 generic.go:334] "Generic (PLEG): container finished" podID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerID="7d4a3171cf827d2bd252bd67d3527faaeb48ec4ad32b82c909c0de55f87057fa" exitCode=0 Mar 07 21:28:16.631061 master-0 kubenswrapper[16352]: I0307 
21:28:16.631047 16352 generic.go:334] "Generic (PLEG): container finished" podID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerID="8b7b3e87871f954bd434780d4486a3763a32e7e97d8bda5a8f30d82a22dc54fa" exitCode=0 Mar 07 21:28:16.631061 master-0 kubenswrapper[16352]: I0307 21:28:16.631055 16352 generic.go:334] "Generic (PLEG): container finished" podID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerID="a53a9a1071ac13a88d1951a193326d32b83a1c1780abad0e0dafbb3804cc8bca" exitCode=0 Mar 07 21:28:16.631061 master-0 kubenswrapper[16352]: I0307 21:28:16.631064 16352 generic.go:334] "Generic (PLEG): container finished" podID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerID="8990d92d7314bd8c6c9472dce3afd8d9a5d4579a2e23086a7bfdf4b6e779d5de" exitCode=0 Mar 07 21:28:16.631409 master-0 kubenswrapper[16352]: I0307 21:28:16.631083 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"df0305c3b23c7fc5354180436d83231087c4a341c296678d6e5b76e30d0f72c4"} Mar 07 21:28:16.631409 master-0 kubenswrapper[16352]: I0307 21:28:16.631110 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"e4adb18ca8c14077d35457f3a2cf7fcda5afb5906a69465cfa0e4c206ff04578"} Mar 07 21:28:16.631409 master-0 kubenswrapper[16352]: I0307 21:28:16.631120 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"7d4a3171cf827d2bd252bd67d3527faaeb48ec4ad32b82c909c0de55f87057fa"} Mar 07 21:28:16.631409 master-0 kubenswrapper[16352]: I0307 21:28:16.631130 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"8b7b3e87871f954bd434780d4486a3763a32e7e97d8bda5a8f30d82a22dc54fa"} Mar 07 21:28:16.631409 master-0 kubenswrapper[16352]: I0307 21:28:16.631140 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"a53a9a1071ac13a88d1951a193326d32b83a1c1780abad0e0dafbb3804cc8bca"} Mar 07 21:28:16.631409 master-0 kubenswrapper[16352]: I0307 21:28:16.631149 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"8990d92d7314bd8c6c9472dce3afd8d9a5d4579a2e23086a7bfdf4b6e779d5de"} Mar 07 21:28:16.791065 master-0 kubenswrapper[16352]: I0307 21:28:16.790997 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:28:16.835061 master-0 kubenswrapper[16352]: I0307 21:28:16.834988 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:28:16.851251 master-0 kubenswrapper[16352]: I0307 21:28:16.851157 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw7xl\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-kube-api-access-dw7xl\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851507 master-0 kubenswrapper[16352]: I0307 21:28:16.851273 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-volume\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851507 master-0 kubenswrapper[16352]: I0307 21:28:16.851380 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-main-tls\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851507 master-0 kubenswrapper[16352]: I0307 21:28:16.851423 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-web\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851667 master-0 kubenswrapper[16352]: I0307 21:28:16.851527 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-tls-assets\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 
21:28:16.851667 master-0 kubenswrapper[16352]: I0307 21:28:16.851569 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-main-db\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851667 master-0 kubenswrapper[16352]: I0307 21:28:16.851604 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851667 master-0 kubenswrapper[16352]: I0307 21:28:16.851651 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-metrics-client-ca\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851667 master-0 kubenswrapper[16352]: I0307 21:28:16.851675 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-web-config\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851953 master-0 kubenswrapper[16352]: I0307 21:28:16.851720 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851953 master-0 kubenswrapper[16352]: I0307 21:28:16.851795 16352 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-out\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.851953 master-0 kubenswrapper[16352]: I0307 21:28:16.851837 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-metric\") pod \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\" (UID: \"65a24af7-ab85-4c88-ab84-c98d1b4efa88\") " Mar 07 21:28:16.853508 master-0 kubenswrapper[16352]: I0307 21:28:16.853419 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "alertmanager-main-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:28:16.858703 master-0 kubenswrapper[16352]: I0307 21:28:16.854419 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:16.858703 master-0 kubenswrapper[16352]: I0307 21:28:16.854939 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:16.858703 master-0 kubenswrapper[16352]: I0307 21:28:16.855874 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.858904 master-0 kubenswrapper[16352]: I0307 21:28:16.858716 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-volume" (OuterVolumeSpecName: "config-volume") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.860975 master-0 kubenswrapper[16352]: I0307 21:28:16.859577 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.861905 master-0 kubenswrapper[16352]: I0307 21:28:16.861854 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-kube-api-access-dw7xl" (OuterVolumeSpecName: "kube-api-access-dw7xl") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "kube-api-access-dw7xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:28:16.862860 master-0 kubenswrapper[16352]: I0307 21:28:16.862813 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:28:16.862975 master-0 kubenswrapper[16352]: I0307 21:28:16.862916 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.864544 master-0 kubenswrapper[16352]: I0307 21:28:16.864464 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "secret-alertmanager-main-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.865040 master-0 kubenswrapper[16352]: I0307 21:28:16.864953 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-out" (OuterVolumeSpecName: "config-out") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:28:16.910074 master-0 kubenswrapper[16352]: I0307 21:28:16.909917 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-web-config" (OuterVolumeSpecName: "web-config") pod "65a24af7-ab85-4c88-ab84-c98d1b4efa88" (UID: "65a24af7-ab85-4c88-ab84-c98d1b4efa88"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.953075 master-0 kubenswrapper[16352]: I0307 21:28:16.952976 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-rulefiles-0\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953343 master-0 kubenswrapper[16352]: I0307 21:28:16.953106 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-kubelet-serving-ca-bundle\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953343 master-0 kubenswrapper[16352]: I0307 21:28:16.953245 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: 
\"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953343 master-0 kubenswrapper[16352]: I0307 21:28:16.953287 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-tls\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953343 master-0 kubenswrapper[16352]: I0307 21:28:16.953315 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-metrics-client-ca\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953555 master-0 kubenswrapper[16352]: I0307 21:28:16.953356 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953555 master-0 kubenswrapper[16352]: I0307 21:28:16.953385 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-thanos-prometheus-http-client-file\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953555 master-0 kubenswrapper[16352]: I0307 21:28:16.953459 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: 
\"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-web-config\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953555 master-0 kubenswrapper[16352]: I0307 21:28:16.953488 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-tls-assets\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953762 master-0 kubenswrapper[16352]: I0307 21:28:16.953578 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-db\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953762 master-0 kubenswrapper[16352]: I0307 21:28:16.953636 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-serving-certs-ca-bundle\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953762 master-0 kubenswrapper[16352]: I0307 21:28:16.953665 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-metrics-client-certs\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953762 master-0 kubenswrapper[16352]: I0307 21:28:16.953733 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953933 master-0 kubenswrapper[16352]: I0307 21:28:16.953806 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953933 master-0 kubenswrapper[16352]: I0307 21:28:16.953855 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-grpc-tls\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.953933 master-0 kubenswrapper[16352]: I0307 21:28:16.953898 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6bcp\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-kube-api-access-h6bcp\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.955132 master-0 kubenswrapper[16352]: I0307 21:28:16.954591 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config-out\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.955132 master-0 kubenswrapper[16352]: I0307 21:28:16.954775 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-kube-rbac-proxy\") pod \"c7ca1461-37ed-4e6b-a289-9f3249d52a24\" (UID: 
\"c7ca1461-37ed-4e6b-a289-9f3249d52a24\") " Mar 07 21:28:16.955806 master-0 kubenswrapper[16352]: I0307 21:28:16.955748 16352 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957176 master-0 kubenswrapper[16352]: I0307 21:28:16.957093 16352 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-metric\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957176 master-0 kubenswrapper[16352]: I0307 21:28:16.957127 16352 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-out\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957176 master-0 kubenswrapper[16352]: I0307 21:28:16.957142 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw7xl\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-kube-api-access-dw7xl\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957176 master-0 kubenswrapper[16352]: I0307 21:28:16.957154 16352 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-config-volume\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957176 master-0 kubenswrapper[16352]: I0307 21:28:16.957164 16352 reconciler_common.go:293] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-main-tls\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957176 master-0 kubenswrapper[16352]: I0307 21:28:16.957173 16352 reconciler_common.go:293] "Volume 
detached for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-secret-alertmanager-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957176 master-0 kubenswrapper[16352]: I0307 21:28:16.957182 16352 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/65a24af7-ab85-4c88-ab84-c98d1b4efa88-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957784 master-0 kubenswrapper[16352]: I0307 21:28:16.957193 16352 reconciler_common.go:293] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-main-db\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957784 master-0 kubenswrapper[16352]: I0307 21:28:16.957204 16352 reconciler_common.go:293] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-alertmanager-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957784 master-0 kubenswrapper[16352]: I0307 21:28:16.957213 16352 reconciler_common.go:293] "Volume detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/65a24af7-ab85-4c88-ab84-c98d1b4efa88-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957784 master-0 kubenswrapper[16352]: I0307 21:28:16.957222 16352 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/65a24af7-ab85-4c88-ab84-c98d1b4efa88-web-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:16.957784 master-0 kubenswrapper[16352]: I0307 21:28:16.956629 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod 
"c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:16.958949 master-0 kubenswrapper[16352]: I0307 21:28:16.958330 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config" (OuterVolumeSpecName: "config") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.958949 master-0 kubenswrapper[16352]: I0307 21:28:16.958719 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config-out" (OuterVolumeSpecName: "config-out") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:28:16.959275 master-0 kubenswrapper[16352]: I0307 21:28:16.959173 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:16.960970 master-0 kubenswrapper[16352]: I0307 21:28:16.960881 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-kube-api-access-h6bcp" (OuterVolumeSpecName: "kube-api-access-h6bcp") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "kube-api-access-h6bcp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:28:16.961739 master-0 kubenswrapper[16352]: I0307 21:28:16.961704 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:28:16.962020 master-0 kubenswrapper[16352]: I0307 21:28:16.961937 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "secret-metrics-client-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.962121 master-0 kubenswrapper[16352]: I0307 21:28:16.961971 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.962175 master-0 kubenswrapper[16352]: I0307 21:28:16.962143 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:16.962442 master-0 kubenswrapper[16352]: I0307 21:28:16.962372 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:16.962590 master-0 kubenswrapper[16352]: I0307 21:28:16.962532 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.964084 master-0 kubenswrapper[16352]: I0307 21:28:16.964018 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.964746 master-0 kubenswrapper[16352]: I0307 21:28:16.964626 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:16.965728 master-0 kubenswrapper[16352]: I0307 21:28:16.965652 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:28:16.966239 master-0 kubenswrapper[16352]: I0307 21:28:16.966190 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.966296 master-0 kubenswrapper[16352]: I0307 21:28:16.966232 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:16.968668 master-0 kubenswrapper[16352]: I0307 21:28:16.968618 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:17.026088 master-0 kubenswrapper[16352]: I0307 21:28:17.025904 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-web-config" (OuterVolumeSpecName: "web-config") pod "c7ca1461-37ed-4e6b-a289-9f3249d52a24" (UID: "c7ca1461-37ed-4e6b-a289-9f3249d52a24"). InnerVolumeSpecName "web-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059151 16352 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-db\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059221 16352 reconciler_common.go:293] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-metrics-client-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059241 16352 reconciler_common.go:293] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-serving-certs-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059258 16352 reconciler_common.go:293] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059273 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config\") on node \"master-0\" DevicePath 
\"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059286 16352 reconciler_common.go:293] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-grpc-tls\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059298 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6bcp\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-kube-api-access-h6bcp\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059310 16352 reconciler_common.go:293] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/c7ca1461-37ed-4e6b-a289-9f3249d52a24-config-out\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059321 16352 reconciler_common.go:293] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-kube-rbac-proxy\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059333 16352 reconciler_common.go:293] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-prometheus-k8s-rulefiles-0\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059346 16352 reconciler_common.go:293] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-kubelet-serving-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059360 16352 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" 
(UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059376 16352 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-tls\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059389 16352 reconciler_common.go:293] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/c7ca1461-37ed-4e6b-a289-9f3249d52a24-configmap-metrics-client-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059402 16352 reconciler_common.go:293] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059415 16352 reconciler_common.go:293] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-thanos-prometheus-http-client-file\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059433 16352 reconciler_common.go:293] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/c7ca1461-37ed-4e6b-a289-9f3249d52a24-web-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:17.060049 master-0 kubenswrapper[16352]: I0307 21:28:17.059442 16352 reconciler_common.go:293] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/c7ca1461-37ed-4e6b-a289-9f3249d52a24-tls-assets\") on node \"master-0\" DevicePath \"\"" Mar 07 
21:28:17.264172 master-0 kubenswrapper[16352]: E0307 21:28:17.264018 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc7ca1461_37ed_4e6b_a289_9f3249d52a24.slice/crio-0e0cb4ab6408b81aa5d639c8fafe19c4036ee4f4337d54a21f5c642bab0143ca\": RecentStats: unable to find data in memory cache]" Mar 07 21:28:17.648599 master-0 kubenswrapper[16352]: I0307 21:28:17.648528 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:28:17.648902 master-0 kubenswrapper[16352]: I0307 21:28:17.648423 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"c7ca1461-37ed-4e6b-a289-9f3249d52a24","Type":"ContainerDied","Data":"0e0cb4ab6408b81aa5d639c8fafe19c4036ee4f4337d54a21f5c642bab0143ca"} Mar 07 21:28:17.648902 master-0 kubenswrapper[16352]: I0307 21:28:17.648870 16352 scope.go:117] "RemoveContainer" containerID="5e2780e2e7c98c2c748c388c7ed2efb0b206c5e5cf8c84e599be58fe861624bd" Mar 07 21:28:17.655922 master-0 kubenswrapper[16352]: I0307 21:28:17.655576 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"65a24af7-ab85-4c88-ab84-c98d1b4efa88","Type":"ContainerDied","Data":"f35b2f615a6cdc98ab3a5e215d01153cd89ba8361e71f5dc88ede17f7c042fc2"} Mar 07 21:28:17.655922 master-0 kubenswrapper[16352]: I0307 21:28:17.655744 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:28:17.677057 master-0 kubenswrapper[16352]: I0307 21:28:17.676989 16352 scope.go:117] "RemoveContainer" containerID="2838178b4f1ff502863f7dcc3d2546a40d8338e52a2be0345ddd81cb81b6cfa5" Mar 07 21:28:17.694162 master-0 kubenswrapper[16352]: I0307 21:28:17.694066 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:28:17.697825 master-0 kubenswrapper[16352]: I0307 21:28:17.697715 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:28:17.706581 master-0 kubenswrapper[16352]: I0307 21:28:17.706486 16352 scope.go:117] "RemoveContainer" containerID="ccdc3057f0c5390145addb4536e286a11c76db79587c90eacf6c9d0ef0c38b2e" Mar 07 21:28:17.717199 master-0 kubenswrapper[16352]: I0307 21:28:17.716730 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 07 21:28:17.721651 master-0 kubenswrapper[16352]: I0307 21:28:17.721142 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 07 21:28:17.733651 master-0 kubenswrapper[16352]: I0307 21:28:17.733586 16352 scope.go:117] "RemoveContainer" containerID="bf67172d49ecbb34d6bd13a173d3bb7089a0a5c3c57b4c66dd6aa4c8fd2f11fc" Mar 07 21:28:17.759192 master-0 kubenswrapper[16352]: I0307 21:28:17.759111 16352 scope.go:117] "RemoveContainer" containerID="a8467b33ca9f402e1c71e8d3f6e98a801a76f328df657df0db479d82b62f50e3" Mar 07 21:28:17.800913 master-0 kubenswrapper[16352]: I0307 21:28:17.800851 16352 scope.go:117] "RemoveContainer" containerID="80d8dc37b0ac916e93ffc8cf045352f9940999409a2ec6b57f769d6ce37829e8" Mar 07 21:28:17.830161 master-0 kubenswrapper[16352]: I0307 21:28:17.830114 16352 scope.go:117] "RemoveContainer" containerID="5a22995b60cfc534ab9fd6af1093820ee79bf42f1f476f345c1e976f3a3fc80b" Mar 07 21:28:17.857896 master-0 kubenswrapper[16352]: 
I0307 21:28:17.857813 16352 scope.go:117] "RemoveContainer" containerID="df0305c3b23c7fc5354180436d83231087c4a341c296678d6e5b76e30d0f72c4" Mar 07 21:28:17.882936 master-0 kubenswrapper[16352]: I0307 21:28:17.882891 16352 scope.go:117] "RemoveContainer" containerID="e4adb18ca8c14077d35457f3a2cf7fcda5afb5906a69465cfa0e4c206ff04578" Mar 07 21:28:17.903188 master-0 kubenswrapper[16352]: I0307 21:28:17.902472 16352 scope.go:117] "RemoveContainer" containerID="7d4a3171cf827d2bd252bd67d3527faaeb48ec4ad32b82c909c0de55f87057fa" Mar 07 21:28:17.925310 master-0 kubenswrapper[16352]: I0307 21:28:17.924912 16352 scope.go:117] "RemoveContainer" containerID="8b7b3e87871f954bd434780d4486a3763a32e7e97d8bda5a8f30d82a22dc54fa" Mar 07 21:28:17.949116 master-0 kubenswrapper[16352]: I0307 21:28:17.948999 16352 scope.go:117] "RemoveContainer" containerID="a53a9a1071ac13a88d1951a193326d32b83a1c1780abad0e0dafbb3804cc8bca" Mar 07 21:28:17.969085 master-0 kubenswrapper[16352]: I0307 21:28:17.969020 16352 scope.go:117] "RemoveContainer" containerID="8990d92d7314bd8c6c9472dce3afd8d9a5d4579a2e23086a7bfdf4b6e779d5de" Mar 07 21:28:17.986879 master-0 kubenswrapper[16352]: I0307 21:28:17.986818 16352 scope.go:117] "RemoveContainer" containerID="2a0b5efee8d9bea443d0c78f75b1bbd14c05bdb0d02fbf32cc6350f09b3b5043" Mar 07 21:28:18.576743 master-0 kubenswrapper[16352]: I0307 21:28:18.576662 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:28:18.576743 master-0 kubenswrapper[16352]: I0307 21:28:18.576732 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:28:18.577335 master-0 kubenswrapper[16352]: I0307 21:28:18.577206 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: 
connection refused" start-of-body= Mar 07 21:28:18.577335 master-0 kubenswrapper[16352]: I0307 21:28:18.577288 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:28:19.200520 master-0 kubenswrapper[16352]: I0307 21:28:19.200433 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" path="/var/lib/kubelet/pods/65a24af7-ab85-4c88-ab84-c98d1b4efa88/volumes" Mar 07 21:28:19.201638 master-0 kubenswrapper[16352]: I0307 21:28:19.201594 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" path="/var/lib/kubelet/pods/c7ca1461-37ed-4e6b-a289-9f3249d52a24/volumes" Mar 07 21:28:19.967674 master-0 kubenswrapper[16352]: I0307 21:28:19.966009 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-698d9d45c9-5wh7z"] Mar 07 21:28:21.218074 master-0 kubenswrapper[16352]: I0307 21:28:21.217862 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-master-0"] Mar 07 21:28:25.190135 master-0 kubenswrapper[16352]: I0307 21:28:25.190045 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:25.552362 master-0 kubenswrapper[16352]: I0307 21:28:25.552251 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 07 21:28:25.552876 master-0 kubenswrapper[16352]: I0307 21:28:25.552805 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" 
containerID="cri-o://29fe93c228bd77fd76218d416ea847bb6245a51f2342d94714a2572a13bb2ff1" gracePeriod=30 Mar 07 21:28:25.553113 master-0 kubenswrapper[16352]: I0307 21:28:25.553036 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" containerID="cri-o://49cc11a235efe78997a02668cffbda8c251aec39c02e2f7118030908ead8c408" gracePeriod=30 Mar 07 21:28:25.553208 master-0 kubenswrapper[16352]: I0307 21:28:25.553125 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" containerID="cri-o://95f51830d903c41c8ee7ab7a8d7de2c678711e9e11302d94e3d2db00f6dd7437" gracePeriod=30 Mar 07 21:28:25.554446 master-0 kubenswrapper[16352]: I0307 21:28:25.554369 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 07 21:28:25.555075 master-0 kubenswrapper[16352]: E0307 21:28:25.555024 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 07 21:28:25.555075 master-0 kubenswrapper[16352]: I0307 21:28:25.555052 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 07 21:28:25.555075 master-0 kubenswrapper[16352]: E0307 21:28:25.555071 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="alertmanager" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: I0307 21:28:25.555086 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="alertmanager" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: E0307 
21:28:25.555104 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="prometheus" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: I0307 21:28:25.555117 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="prometheus" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: E0307 21:28:25.555134 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="config-reloader" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: I0307 21:28:25.555150 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="config-reloader" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: E0307 21:28:25.555186 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: I0307 21:28:25.555204 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: E0307 21:28:25.555246 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" Mar 07 21:28:25.555282 master-0 kubenswrapper[16352]: I0307 21:28:25.555265 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555298 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-metric" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555318 16352 state_mem.go:107] "Deleted 
CPUSet assignment" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-metric" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555357 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="wait-for-host-port" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555375 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="wait-for-host-port" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555406 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-web" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555423 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-web" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555463 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555518 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555561 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555576 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555600 16352 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="init-config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555614 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="init-config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555644 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555662 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555729 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-web" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555752 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-web" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555783 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="thanos-sidecar" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555844 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="thanos-sidecar" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555876 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="init-config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555895 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" 
containerName="init-config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555920 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555937 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.555960 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-thanos" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.555974 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-thanos" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: E0307 21:28:25.556020 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="prom-label-proxy" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556034 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="prom-label-proxy" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556272 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556310 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="alertmanager" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556327 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 07 21:28:25.557724 master-0 
kubenswrapper[16352]: I0307 21:28:25.556349 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="prom-label-proxy" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556371 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="thanos-sidecar" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556389 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556407 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-recovery-controller" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556438 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="prometheus" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556456 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556479 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-metric" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556577 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="config-reloader" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556610 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-thanos" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 
21:28:25.556638 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy-web" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556667 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="65a24af7-ab85-4c88-ab84-c98d1b4efa88" containerName="kube-rbac-proxy-web" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556728 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7ca1461-37ed-4e6b-a289-9f3249d52a24" containerName="kube-rbac-proxy" Mar 07 21:28:25.557724 master-0 kubenswrapper[16352]: I0307 21:28:25.556758 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d3d45b6ce1b3764f9927e623a71adf8" containerName="kube-scheduler-cert-syncer" Mar 07 21:28:25.623716 master-0 kubenswrapper[16352]: I0307 21:28:25.623641 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:25.623978 master-0 kubenswrapper[16352]: I0307 21:28:25.623744 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:25.725936 master-0 kubenswrapper[16352]: I0307 21:28:25.725718 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: 
\"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:25.725936 master-0 kubenswrapper[16352]: I0307 21:28:25.725936 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:25.726248 master-0 kubenswrapper[16352]: I0307 21:28:25.725997 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-resource-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:25.726248 master-0 kubenswrapper[16352]: I0307 21:28:25.726073 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1453f6461bf5d599ad65a4656343ee91-cert-dir\") pod \"openshift-kube-scheduler-master-0\" (UID: \"1453f6461bf5d599ad65a4656343ee91\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:25.785472 master-0 kubenswrapper[16352]: I0307 21:28:25.785399 16352 generic.go:334] "Generic (PLEG): container finished" podID="a69242fc-53d6-48f5-82a9-52daf194d047" containerID="0e46543b367b8d876d0a1dfbab15863e93b3021665df29db594c1bdd53d219ed" exitCode=0 Mar 07 21:28:25.785612 master-0 kubenswrapper[16352]: I0307 21:28:25.785484 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-retry-1-master-0" event={"ID":"a69242fc-53d6-48f5-82a9-52daf194d047","Type":"ContainerDied","Data":"0e46543b367b8d876d0a1dfbab15863e93b3021665df29db594c1bdd53d219ed"} Mar 07 21:28:25.787980 master-0 kubenswrapper[16352]: I0307 
21:28:25.787946 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler-cert-syncer/0.log" Mar 07 21:28:25.788964 master-0 kubenswrapper[16352]: I0307 21:28:25.788925 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 07 21:28:25.790605 master-0 kubenswrapper[16352]: I0307 21:28:25.790554 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler/0.log" Mar 07 21:28:25.791391 master-0 kubenswrapper[16352]: I0307 21:28:25.791366 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="49cc11a235efe78997a02668cffbda8c251aec39c02e2f7118030908ead8c408" exitCode=0 Mar 07 21:28:25.791391 master-0 kubenswrapper[16352]: I0307 21:28:25.791388 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="95f51830d903c41c8ee7ab7a8d7de2c678711e9e11302d94e3d2db00f6dd7437" exitCode=0 Mar 07 21:28:25.791486 master-0 kubenswrapper[16352]: I0307 21:28:25.791400 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d3d45b6ce1b3764f9927e623a71adf8" containerID="29fe93c228bd77fd76218d416ea847bb6245a51f2342d94714a2572a13bb2ff1" exitCode=2 Mar 07 21:28:25.791486 master-0 kubenswrapper[16352]: I0307 21:28:25.791459 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6ec220a16a4414ebbd87f4036017b07cef5c4b07da82f02aa544a2d6c79d687e" Mar 07 21:28:25.791486 master-0 kubenswrapper[16352]: I0307 21:28:25.791480 16352 scope.go:117] "RemoveContainer" 
containerID="75482995cc5f55d9d7fb4b8a57bf5ec36cbaac14083b2719abeb4a1eb62846bc" Mar 07 21:28:25.794330 master-0 kubenswrapper[16352]: I0307 21:28:25.794296 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/3.log" Mar 07 21:28:25.795974 master-0 kubenswrapper[16352]: I0307 21:28:25.795949 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:28:25.796058 master-0 kubenswrapper[16352]: I0307 21:28:25.795997 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"1c6f1e263aa1f0a5ac95d2a74e2c146c","Type":"ContainerStarted","Data":"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3"} Mar 07 21:28:25.852577 master-0 kubenswrapper[16352]: I0307 21:28:25.852468 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-master-0" podStartSLOduration=4.852441012 podStartE2EDuration="4.852441012s" podCreationTimestamp="2026-03-07 21:28:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:28:25.837782976 +0000 UTC m=+628.908488075" watchObservedRunningTime="2026-03-07 21:28:25.852441012 +0000 UTC m=+628.923146071" Mar 07 21:28:25.883533 master-0 kubenswrapper[16352]: I0307 21:28:25.883466 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler-cert-syncer/0.log" Mar 07 21:28:25.884116 master-0 kubenswrapper[16352]: I0307 21:28:25.884074 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:25.890663 master-0 kubenswrapper[16352]: I0307 21:28:25.890575 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 07 21:28:25.929635 master-0 kubenswrapper[16352]: I0307 21:28:25.929557 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") pod \"1d3d45b6ce1b3764f9927e623a71adf8\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " Mar 07 21:28:25.929984 master-0 kubenswrapper[16352]: I0307 21:28:25.929661 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") pod \"1d3d45b6ce1b3764f9927e623a71adf8\" (UID: \"1d3d45b6ce1b3764f9927e623a71adf8\") " Mar 07 21:28:25.929984 master-0 kubenswrapper[16352]: I0307 21:28:25.929751 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "1d3d45b6ce1b3764f9927e623a71adf8" (UID: "1d3d45b6ce1b3764f9927e623a71adf8"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:25.929984 master-0 kubenswrapper[16352]: I0307 21:28:25.929892 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "1d3d45b6ce1b3764f9927e623a71adf8" (UID: "1d3d45b6ce1b3764f9927e623a71adf8"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:25.930112 master-0 kubenswrapper[16352]: I0307 21:28:25.930081 16352 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:25.930112 master-0 kubenswrapper[16352]: I0307 21:28:25.930102 16352 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1d3d45b6ce1b3764f9927e623a71adf8-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:26.813325 master-0 kubenswrapper[16352]: I0307 21:28:26.813120 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-scheduler_openshift-kube-scheduler-master-0_1d3d45b6ce1b3764f9927e623a71adf8/kube-scheduler-cert-syncer/0.log" Mar 07 21:28:26.815154 master-0 kubenswrapper[16352]: I0307 21:28:26.815092 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:26.820099 master-0 kubenswrapper[16352]: I0307 21:28:26.819973 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 07 21:28:26.856845 master-0 kubenswrapper[16352]: I0307 21:28:26.856749 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" oldPodUID="1d3d45b6ce1b3764f9927e623a71adf8" podUID="1453f6461bf5d599ad65a4656343ee91" Mar 07 21:28:27.150587 master-0 kubenswrapper[16352]: I0307 21:28:27.150468 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 07 21:28:27.152227 master-0 kubenswrapper[16352]: I0307 21:28:27.152135 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-cert-syncer" containerID="cri-o://3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" gracePeriod=30 Mar 07 21:28:27.152620 master-0 kubenswrapper[16352]: I0307 21:28:27.152549 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" containerID="cri-o://4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" gracePeriod=30 Mar 07 21:28:27.152792 master-0 kubenswrapper[16352]: I0307 21:28:27.152731 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" 
podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager" containerID="cri-o://51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" gracePeriod=30 Mar 07 21:28:27.152912 master-0 kubenswrapper[16352]: I0307 21:28:27.152866 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-recovery-controller" containerID="cri-o://91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" gracePeriod=30 Mar 07 21:28:27.154015 master-0 kubenswrapper[16352]: I0307 21:28:27.153941 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 07 21:28:27.154631 master-0 kubenswrapper[16352]: E0307 21:28:27.154465 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.154804 master-0 kubenswrapper[16352]: I0307 21:28:27.154667 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.154916 master-0 kubenswrapper[16352]: E0307 21:28:27.154874 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-cert-syncer" Mar 07 21:28:27.154997 master-0 kubenswrapper[16352]: I0307 21:28:27.154916 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-cert-syncer" Mar 07 21:28:27.154997 master-0 kubenswrapper[16352]: E0307 21:28:27.154943 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.154997 master-0 kubenswrapper[16352]: I0307 
21:28:27.154963 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.154997 master-0 kubenswrapper[16352]: E0307 21:28:27.154995 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.155233 master-0 kubenswrapper[16352]: I0307 21:28:27.155016 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.155233 master-0 kubenswrapper[16352]: E0307 21:28:27.155049 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager" Mar 07 21:28:27.155233 master-0 kubenswrapper[16352]: I0307 21:28:27.155070 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager" Mar 07 21:28:27.155233 master-0 kubenswrapper[16352]: E0307 21:28:27.155132 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-recovery-controller" Mar 07 21:28:27.155480 master-0 kubenswrapper[16352]: I0307 21:28:27.155243 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-recovery-controller" Mar 07 21:28:27.155480 master-0 kubenswrapper[16352]: E0307 21:28:27.155286 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager" Mar 07 21:28:27.155480 master-0 kubenswrapper[16352]: I0307 21:28:27.155308 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager" Mar 07 21:28:27.155752 master-0 kubenswrapper[16352]: I0307 
21:28:27.155660 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager" Mar 07 21:28:27.155846 master-0 kubenswrapper[16352]: I0307 21:28:27.155753 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.155846 master-0 kubenswrapper[16352]: I0307 21:28:27.155797 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.155846 master-0 kubenswrapper[16352]: I0307 21:28:27.155826 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-cert-syncer" Mar 07 21:28:27.156015 master-0 kubenswrapper[16352]: I0307 21:28:27.155886 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.156015 master-0 kubenswrapper[16352]: I0307 21:28:27.155912 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.156015 master-0 kubenswrapper[16352]: I0307 21:28:27.155956 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager-recovery-controller" Mar 07 21:28:27.156393 master-0 kubenswrapper[16352]: E0307 21:28:27.156333 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.156393 master-0 kubenswrapper[16352]: I0307 21:28:27.156380 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.156522 master-0 
kubenswrapper[16352]: E0307 21:28:27.156431 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.156522 master-0 kubenswrapper[16352]: I0307 21:28:27.156453 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.156853 master-0 kubenswrapper[16352]: I0307 21:28:27.156786 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="kube-controller-manager" Mar 07 21:28:27.156974 master-0 kubenswrapper[16352]: I0307 21:28:27.156855 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerName="cluster-policy-controller" Mar 07 21:28:27.220767 master-0 kubenswrapper[16352]: I0307 21:28:27.214588 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d3d45b6ce1b3764f9927e623a71adf8" path="/var/lib/kubelet/pods/1d3d45b6ce1b3764f9927e623a71adf8/volumes" Mar 07 21:28:27.227207 master-0 kubenswrapper[16352]: I0307 21:28:27.225240 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" podUID="0603fd6b1c0835ce0492441d0e22b91c" Mar 07 21:28:27.261839 master-0 kubenswrapper[16352]: I0307 21:28:27.261748 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0603fd6b1c0835ce0492441d0e22b91c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0603fd6b1c0835ce0492441d0e22b91c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:27.262201 master-0 kubenswrapper[16352]: I0307 21:28:27.262097 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0603fd6b1c0835ce0492441d0e22b91c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0603fd6b1c0835ce0492441d0e22b91c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:27.364086 master-0 kubenswrapper[16352]: I0307 21:28:27.363749 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0603fd6b1c0835ce0492441d0e22b91c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0603fd6b1c0835ce0492441d0e22b91c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:27.364086 master-0 kubenswrapper[16352]: I0307 21:28:27.363888 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0603fd6b1c0835ce0492441d0e22b91c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0603fd6b1c0835ce0492441d0e22b91c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:27.364086 master-0 kubenswrapper[16352]: I0307 21:28:27.363927 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/0603fd6b1c0835ce0492441d0e22b91c-cert-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0603fd6b1c0835ce0492441d0e22b91c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:27.364086 master-0 kubenswrapper[16352]: I0307 21:28:27.364001 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/0603fd6b1c0835ce0492441d0e22b91c-resource-dir\") pod \"kube-controller-manager-master-0\" (UID: \"0603fd6b1c0835ce0492441d0e22b91c\") " pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 
21:28:27.474512 master-0 kubenswrapper[16352]: I0307 21:28:27.474362 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/3.log" Mar 07 21:28:27.476759 master-0 kubenswrapper[16352]: I0307 21:28:27.476717 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager-cert-syncer/0.log" Mar 07 21:28:27.477059 master-0 kubenswrapper[16352]: E0307 21:28:27.477021 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-pod5be2b4e6_94a5_4282_ba6f_86dc7634a28d.slice/crio-conmon-91ae59cbf2b5cf1a678305d658269b272dfb65a219a968058791a1ac204a5668.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-pod5be2b4e6_94a5_4282_ba6f_86dc7634a28d.slice/crio-91ae59cbf2b5cf1a678305d658269b272dfb65a219a968058791a1ac204a5668.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:28:27.477669 master-0 kubenswrapper[16352]: I0307 21:28:27.477633 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:28:27.477848 master-0 kubenswrapper[16352]: I0307 21:28:27.477806 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:27.478473 master-0 kubenswrapper[16352]: I0307 21:28:27.478445 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/installer-5-retry-1-master-0" Mar 07 21:28:27.482315 master-0 kubenswrapper[16352]: I0307 21:28:27.482191 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" podUID="0603fd6b1c0835ce0492441d0e22b91c" Mar 07 21:28:27.566447 master-0 kubenswrapper[16352]: I0307 21:28:27.566351 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-var-lock\") pod \"a69242fc-53d6-48f5-82a9-52daf194d047\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " Mar 07 21:28:27.566896 master-0 kubenswrapper[16352]: I0307 21:28:27.566496 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-var-lock" (OuterVolumeSpecName: "var-lock") pod "a69242fc-53d6-48f5-82a9-52daf194d047" (UID: "a69242fc-53d6-48f5-82a9-52daf194d047"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:27.566896 master-0 kubenswrapper[16352]: I0307 21:28:27.566580 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-resource-dir\") pod \"1c6f1e263aa1f0a5ac95d2a74e2c146c\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " Mar 07 21:28:27.566896 master-0 kubenswrapper[16352]: I0307 21:28:27.566766 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a69242fc-53d6-48f5-82a9-52daf194d047-kube-api-access\") pod \"a69242fc-53d6-48f5-82a9-52daf194d047\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " Mar 07 21:28:27.567121 master-0 kubenswrapper[16352]: I0307 21:28:27.566931 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-cert-dir\") pod \"1c6f1e263aa1f0a5ac95d2a74e2c146c\" (UID: \"1c6f1e263aa1f0a5ac95d2a74e2c146c\") " Mar 07 21:28:27.567121 master-0 kubenswrapper[16352]: I0307 21:28:27.566968 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-kubelet-dir\") pod \"a69242fc-53d6-48f5-82a9-52daf194d047\" (UID: \"a69242fc-53d6-48f5-82a9-52daf194d047\") " Mar 07 21:28:27.567121 master-0 kubenswrapper[16352]: I0307 21:28:27.566763 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "1c6f1e263aa1f0a5ac95d2a74e2c146c" (UID: "1c6f1e263aa1f0a5ac95d2a74e2c146c"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:27.567121 master-0 kubenswrapper[16352]: I0307 21:28:27.567087 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "1c6f1e263aa1f0a5ac95d2a74e2c146c" (UID: "1c6f1e263aa1f0a5ac95d2a74e2c146c"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:27.567402 master-0 kubenswrapper[16352]: I0307 21:28:27.567127 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "a69242fc-53d6-48f5-82a9-52daf194d047" (UID: "a69242fc-53d6-48f5-82a9-52daf194d047"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:27.567571 master-0 kubenswrapper[16352]: I0307 21:28:27.567520 16352 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-cert-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:27.567640 master-0 kubenswrapper[16352]: I0307 21:28:27.567561 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:27.567640 master-0 kubenswrapper[16352]: I0307 21:28:27.567594 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/a69242fc-53d6-48f5-82a9-52daf194d047-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:27.567640 master-0 kubenswrapper[16352]: I0307 21:28:27.567621 16352 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/1c6f1e263aa1f0a5ac95d2a74e2c146c-resource-dir\") on 
node \"master-0\" DevicePath \"\"" Mar 07 21:28:27.570305 master-0 kubenswrapper[16352]: I0307 21:28:27.570260 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a69242fc-53d6-48f5-82a9-52daf194d047-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "a69242fc-53d6-48f5-82a9-52daf194d047" (UID: "a69242fc-53d6-48f5-82a9-52daf194d047"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:28:27.670229 master-0 kubenswrapper[16352]: I0307 21:28:27.670142 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/a69242fc-53d6-48f5-82a9-52daf194d047-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:27.829480 master-0 kubenswrapper[16352]: I0307 21:28:27.829253 16352 generic.go:334] "Generic (PLEG): container finished" podID="5be2b4e6-94a5-4282-ba6f-86dc7634a28d" containerID="91ae59cbf2b5cf1a678305d658269b272dfb65a219a968058791a1ac204a5668" exitCode=0 Mar 07 21:28:27.829480 master-0 kubenswrapper[16352]: I0307 21:28:27.829357 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"5be2b4e6-94a5-4282-ba6f-86dc7634a28d","Type":"ContainerDied","Data":"91ae59cbf2b5cf1a678305d658269b272dfb65a219a968058791a1ac204a5668"} Mar 07 21:28:27.832174 master-0 kubenswrapper[16352]: I0307 21:28:27.832039 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/installer-5-retry-1-master-0" event={"ID":"a69242fc-53d6-48f5-82a9-52daf194d047","Type":"ContainerDied","Data":"f90f6ca5e0e2250794f5f84ea98270a7ca527bad7f8c21be75568e19d081be09"} Mar 07 21:28:27.832174 master-0 kubenswrapper[16352]: I0307 21:28:27.832123 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f90f6ca5e0e2250794f5f84ea98270a7ca527bad7f8c21be75568e19d081be09" Mar 07 21:28:27.832802 master-0 
kubenswrapper[16352]: I0307 21:28:27.832628 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/installer-5-retry-1-master-0" Mar 07 21:28:27.835713 master-0 kubenswrapper[16352]: I0307 21:28:27.835636 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/cluster-policy-controller/3.log" Mar 07 21:28:27.837817 master-0 kubenswrapper[16352]: I0307 21:28:27.837781 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager-cert-syncer/0.log" Mar 07 21:28:27.838851 master-0 kubenswrapper[16352]: I0307 21:28:27.838781 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_1c6f1e263aa1f0a5ac95d2a74e2c146c/kube-controller-manager/0.log" Mar 07 21:28:27.838851 master-0 kubenswrapper[16352]: I0307 21:28:27.838871 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" exitCode=0 Mar 07 21:28:27.838851 master-0 kubenswrapper[16352]: I0307 21:28:27.838941 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" exitCode=0 Mar 07 21:28:27.838851 master-0 kubenswrapper[16352]: I0307 21:28:27.838964 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" containerID="91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" exitCode=0 Mar 07 21:28:27.838851 master-0 kubenswrapper[16352]: I0307 21:28:27.838979 16352 generic.go:334] "Generic (PLEG): container finished" podID="1c6f1e263aa1f0a5ac95d2a74e2c146c" 
containerID="3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" exitCode=2 Mar 07 21:28:27.838851 master-0 kubenswrapper[16352]: I0307 21:28:27.839015 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:27.838851 master-0 kubenswrapper[16352]: I0307 21:28:27.839049 16352 scope.go:117] "RemoveContainer" containerID="4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" Mar 07 21:28:27.862461 master-0 kubenswrapper[16352]: I0307 21:28:27.862306 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" podUID="0603fd6b1c0835ce0492441d0e22b91c" Mar 07 21:28:27.876847 master-0 kubenswrapper[16352]: I0307 21:28:27.876738 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" oldPodUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" podUID="0603fd6b1c0835ce0492441d0e22b91c" Mar 07 21:28:27.877020 master-0 kubenswrapper[16352]: I0307 21:28:27.876919 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:27.909464 master-0 kubenswrapper[16352]: I0307 21:28:27.909285 16352 scope.go:117] "RemoveContainer" containerID="51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" Mar 07 21:28:27.982467 master-0 kubenswrapper[16352]: I0307 21:28:27.982347 16352 scope.go:117] "RemoveContainer" containerID="91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" Mar 07 21:28:28.004815 master-0 kubenswrapper[16352]: I0307 21:28:28.004741 16352 scope.go:117] "RemoveContainer" containerID="3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" Mar 07 21:28:28.023324 master-0 kubenswrapper[16352]: I0307 
21:28:28.023258 16352 scope.go:117] "RemoveContainer" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" Mar 07 21:28:28.043972 master-0 kubenswrapper[16352]: I0307 21:28:28.043870 16352 scope.go:117] "RemoveContainer" containerID="4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" Mar 07 21:28:28.044590 master-0 kubenswrapper[16352]: E0307 21:28:28.044534 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": container with ID starting with 4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3 not found: ID does not exist" containerID="4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" Mar 07 21:28:28.044876 master-0 kubenswrapper[16352]: I0307 21:28:28.044590 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3"} err="failed to get container status \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": rpc error: code = NotFound desc = could not find container \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": container with ID starting with 4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3 not found: ID does not exist" Mar 07 21:28:28.044876 master-0 kubenswrapper[16352]: I0307 21:28:28.044654 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:28.045476 master-0 kubenswrapper[16352]: E0307 21:28:28.045401 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": container with ID starting with e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b not found: ID does not 
exist" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:28.045608 master-0 kubenswrapper[16352]: I0307 21:28:28.045466 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"} err="failed to get container status \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": rpc error: code = NotFound desc = could not find container \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": container with ID starting with e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b not found: ID does not exist" Mar 07 21:28:28.045608 master-0 kubenswrapper[16352]: I0307 21:28:28.045508 16352 scope.go:117] "RemoveContainer" containerID="51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" Mar 07 21:28:28.046140 master-0 kubenswrapper[16352]: E0307 21:28:28.046047 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": container with ID starting with 51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a not found: ID does not exist" containerID="51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" Mar 07 21:28:28.046270 master-0 kubenswrapper[16352]: I0307 21:28:28.046150 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a"} err="failed to get container status \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": rpc error: code = NotFound desc = could not find container \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": container with ID starting with 51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a not found: ID does not exist" Mar 07 21:28:28.046270 
master-0 kubenswrapper[16352]: I0307 21:28:28.046226 16352 scope.go:117] "RemoveContainer" containerID="91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" Mar 07 21:28:28.046992 master-0 kubenswrapper[16352]: E0307 21:28:28.046930 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": container with ID starting with 91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9 not found: ID does not exist" containerID="91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" Mar 07 21:28:28.046992 master-0 kubenswrapper[16352]: I0307 21:28:28.046968 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9"} err="failed to get container status \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": rpc error: code = NotFound desc = could not find container \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": container with ID starting with 91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9 not found: ID does not exist" Mar 07 21:28:28.046992 master-0 kubenswrapper[16352]: I0307 21:28:28.046990 16352 scope.go:117] "RemoveContainer" containerID="3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" Mar 07 21:28:28.047632 master-0 kubenswrapper[16352]: E0307 21:28:28.047538 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": container with ID starting with 3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61 not found: ID does not exist" containerID="3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" Mar 07 21:28:28.047632 master-0 kubenswrapper[16352]: I0307 
21:28:28.047604 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61"} err="failed to get container status \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": rpc error: code = NotFound desc = could not find container \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": container with ID starting with 3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61 not found: ID does not exist" Mar 07 21:28:28.048178 master-0 kubenswrapper[16352]: I0307 21:28:28.047655 16352 scope.go:117] "RemoveContainer" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" Mar 07 21:28:28.048517 master-0 kubenswrapper[16352]: E0307 21:28:28.048399 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": container with ID starting with 9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4 not found: ID does not exist" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" Mar 07 21:28:28.048517 master-0 kubenswrapper[16352]: I0307 21:28:28.048434 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4"} err="failed to get container status \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": rpc error: code = NotFound desc = could not find container \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": container with ID starting with 9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4 not found: ID does not exist" Mar 07 21:28:28.048517 master-0 kubenswrapper[16352]: I0307 21:28:28.048464 16352 scope.go:117] "RemoveContainer" 
containerID="4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" Mar 07 21:28:28.049339 master-0 kubenswrapper[16352]: I0307 21:28:28.048894 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3"} err="failed to get container status \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": rpc error: code = NotFound desc = could not find container \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": container with ID starting with 4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3 not found: ID does not exist" Mar 07 21:28:28.049339 master-0 kubenswrapper[16352]: I0307 21:28:28.048948 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:28.049847 master-0 kubenswrapper[16352]: I0307 21:28:28.049622 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"} err="failed to get container status \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": rpc error: code = NotFound desc = could not find container \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": container with ID starting with e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b not found: ID does not exist" Mar 07 21:28:28.049847 master-0 kubenswrapper[16352]: I0307 21:28:28.049670 16352 scope.go:117] "RemoveContainer" containerID="51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" Mar 07 21:28:28.050364 master-0 kubenswrapper[16352]: I0307 21:28:28.050156 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a"} err="failed to get container status 
\"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": rpc error: code = NotFound desc = could not find container \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": container with ID starting with 51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a not found: ID does not exist" Mar 07 21:28:28.050364 master-0 kubenswrapper[16352]: I0307 21:28:28.050192 16352 scope.go:117] "RemoveContainer" containerID="91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" Mar 07 21:28:28.050898 master-0 kubenswrapper[16352]: I0307 21:28:28.050828 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9"} err="failed to get container status \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": rpc error: code = NotFound desc = could not find container \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": container with ID starting with 91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9 not found: ID does not exist" Mar 07 21:28:28.050898 master-0 kubenswrapper[16352]: I0307 21:28:28.050863 16352 scope.go:117] "RemoveContainer" containerID="3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" Mar 07 21:28:28.051362 master-0 kubenswrapper[16352]: I0307 21:28:28.051279 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61"} err="failed to get container status \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": rpc error: code = NotFound desc = could not find container \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": container with ID starting with 3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61 not found: ID does not exist" Mar 07 21:28:28.051362 master-0 kubenswrapper[16352]: I0307 
21:28:28.051314 16352 scope.go:117] "RemoveContainer" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" Mar 07 21:28:28.052172 master-0 kubenswrapper[16352]: I0307 21:28:28.051978 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4"} err="failed to get container status \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": rpc error: code = NotFound desc = could not find container \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": container with ID starting with 9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4 not found: ID does not exist" Mar 07 21:28:28.052172 master-0 kubenswrapper[16352]: I0307 21:28:28.052006 16352 scope.go:117] "RemoveContainer" containerID="4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" Mar 07 21:28:28.052440 master-0 kubenswrapper[16352]: I0307 21:28:28.052405 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3"} err="failed to get container status \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": rpc error: code = NotFound desc = could not find container \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": container with ID starting with 4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3 not found: ID does not exist" Mar 07 21:28:28.052491 master-0 kubenswrapper[16352]: I0307 21:28:28.052436 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:28.052959 master-0 kubenswrapper[16352]: I0307 21:28:28.052900 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"} err="failed to get 
container status \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": rpc error: code = NotFound desc = could not find container \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": container with ID starting with e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b not found: ID does not exist" Mar 07 21:28:28.052959 master-0 kubenswrapper[16352]: I0307 21:28:28.052944 16352 scope.go:117] "RemoveContainer" containerID="51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" Mar 07 21:28:28.053476 master-0 kubenswrapper[16352]: I0307 21:28:28.053435 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a"} err="failed to get container status \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": rpc error: code = NotFound desc = could not find container \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": container with ID starting with 51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a not found: ID does not exist" Mar 07 21:28:28.053476 master-0 kubenswrapper[16352]: I0307 21:28:28.053466 16352 scope.go:117] "RemoveContainer" containerID="91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" Mar 07 21:28:28.054015 master-0 kubenswrapper[16352]: I0307 21:28:28.053956 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9"} err="failed to get container status \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": rpc error: code = NotFound desc = could not find container \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": container with ID starting with 91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9 not found: ID does not exist" Mar 07 21:28:28.054073 master-0 kubenswrapper[16352]: 
I0307 21:28:28.054012 16352 scope.go:117] "RemoveContainer" containerID="3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" Mar 07 21:28:28.054616 master-0 kubenswrapper[16352]: I0307 21:28:28.054554 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61"} err="failed to get container status \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": rpc error: code = NotFound desc = could not find container \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": container with ID starting with 3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61 not found: ID does not exist" Mar 07 21:28:28.054616 master-0 kubenswrapper[16352]: I0307 21:28:28.054608 16352 scope.go:117] "RemoveContainer" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" Mar 07 21:28:28.055139 master-0 kubenswrapper[16352]: I0307 21:28:28.055088 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4"} err="failed to get container status \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": rpc error: code = NotFound desc = could not find container \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": container with ID starting with 9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4 not found: ID does not exist" Mar 07 21:28:28.055139 master-0 kubenswrapper[16352]: I0307 21:28:28.055131 16352 scope.go:117] "RemoveContainer" containerID="4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3" Mar 07 21:28:28.055539 master-0 kubenswrapper[16352]: I0307 21:28:28.055483 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3"} err="failed to 
get container status \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": rpc error: code = NotFound desc = could not find container \"4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3\": container with ID starting with 4536c3db1cf5f6a3812f051382c3bb1d325601836432631af8952d25c8c050e3 not found: ID does not exist" Mar 07 21:28:28.055596 master-0 kubenswrapper[16352]: I0307 21:28:28.055533 16352 scope.go:117] "RemoveContainer" containerID="e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b" Mar 07 21:28:28.056033 master-0 kubenswrapper[16352]: I0307 21:28:28.055994 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b"} err="failed to get container status \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": rpc error: code = NotFound desc = could not find container \"e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b\": container with ID starting with e8eb93b41221cdae93ddb1e158e83643cd346638dbd8fcb640d2b07ca4de590b not found: ID does not exist" Mar 07 21:28:28.056033 master-0 kubenswrapper[16352]: I0307 21:28:28.056026 16352 scope.go:117] "RemoveContainer" containerID="51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a" Mar 07 21:28:28.056447 master-0 kubenswrapper[16352]: I0307 21:28:28.056399 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a"} err="failed to get container status \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": rpc error: code = NotFound desc = could not find container \"51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a\": container with ID starting with 51cd9309c0bece4d3ac259923d171f0d8f7fd1603939ac62d199660ffc995e1a not found: ID does not exist" Mar 07 21:28:28.056447 master-0 
kubenswrapper[16352]: I0307 21:28:28.056441 16352 scope.go:117] "RemoveContainer" containerID="91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9" Mar 07 21:28:28.056952 master-0 kubenswrapper[16352]: I0307 21:28:28.056874 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9"} err="failed to get container status \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": rpc error: code = NotFound desc = could not find container \"91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9\": container with ID starting with 91fd59aaf5fc7b70200846bd170dd954dabac47212d9da47374af4c75b5abab9 not found: ID does not exist" Mar 07 21:28:28.057025 master-0 kubenswrapper[16352]: I0307 21:28:28.056945 16352 scope.go:117] "RemoveContainer" containerID="3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61" Mar 07 21:28:28.057515 master-0 kubenswrapper[16352]: I0307 21:28:28.057482 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61"} err="failed to get container status \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": rpc error: code = NotFound desc = could not find container \"3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61\": container with ID starting with 3354d43364e4e478db9ab9669759c7b5f0726730442cf82ef603bba7c5c7dc61 not found: ID does not exist" Mar 07 21:28:28.057515 master-0 kubenswrapper[16352]: I0307 21:28:28.057506 16352 scope.go:117] "RemoveContainer" containerID="9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4" Mar 07 21:28:28.057945 master-0 kubenswrapper[16352]: I0307 21:28:28.057912 16352 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4"} err="failed to get container status \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": rpc error: code = NotFound desc = could not find container \"9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4\": container with ID starting with 9833cafe1dccbaedd50b3699ccd5ee0a32ce02eae75c9a44b29d5870e53248f4 not found: ID does not exist" Mar 07 21:28:28.577131 master-0 kubenswrapper[16352]: I0307 21:28:28.576950 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:28:28.577131 master-0 kubenswrapper[16352]: I0307 21:28:28.577061 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:28:29.222824 master-0 kubenswrapper[16352]: I0307 21:28:29.220440 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1c6f1e263aa1f0a5ac95d2a74e2c146c" path="/var/lib/kubelet/pods/1c6f1e263aa1f0a5ac95d2a74e2c146c/volumes" Mar 07 21:28:29.311115 master-0 kubenswrapper[16352]: I0307 21:28:29.311050 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 07 21:28:29.417745 master-0 kubenswrapper[16352]: I0307 21:28:29.417641 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kube-api-access\") pod \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " Mar 07 21:28:29.418019 master-0 kubenswrapper[16352]: I0307 21:28:29.417930 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-var-lock\") pod \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " Mar 07 21:28:29.418019 master-0 kubenswrapper[16352]: I0307 21:28:29.417985 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kubelet-dir\") pod \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\" (UID: \"5be2b4e6-94a5-4282-ba6f-86dc7634a28d\") " Mar 07 21:28:29.418159 master-0 kubenswrapper[16352]: I0307 21:28:29.418113 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5be2b4e6-94a5-4282-ba6f-86dc7634a28d" (UID: "5be2b4e6-94a5-4282-ba6f-86dc7634a28d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:29.418222 master-0 kubenswrapper[16352]: I0307 21:28:29.418111 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-var-lock" (OuterVolumeSpecName: "var-lock") pod "5be2b4e6-94a5-4282-ba6f-86dc7634a28d" (UID: "5be2b4e6-94a5-4282-ba6f-86dc7634a28d"). 
InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:29.419163 master-0 kubenswrapper[16352]: I0307 21:28:29.418914 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:29.419163 master-0 kubenswrapper[16352]: I0307 21:28:29.419152 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kubelet-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:29.421582 master-0 kubenswrapper[16352]: I0307 21:28:29.421533 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "5be2b4e6-94a5-4282-ba6f-86dc7634a28d" (UID: "5be2b4e6-94a5-4282-ba6f-86dc7634a28d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:28:29.521412 master-0 kubenswrapper[16352]: I0307 21:28:29.521157 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5be2b4e6-94a5-4282-ba6f-86dc7634a28d-kube-api-access\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:29.868313 master-0 kubenswrapper[16352]: I0307 21:28:29.868159 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/installer-5-master-0" event={"ID":"5be2b4e6-94a5-4282-ba6f-86dc7634a28d","Type":"ContainerDied","Data":"56dff1fc3fed95aa37852fd9137401ab4752e0ca9f4ffe1ccf3ee9d5239ec6bd"} Mar 07 21:28:29.868313 master-0 kubenswrapper[16352]: I0307 21:28:29.868240 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/installer-5-master-0" Mar 07 21:28:29.868743 master-0 kubenswrapper[16352]: I0307 21:28:29.868251 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="56dff1fc3fed95aa37852fd9137401ab4752e0ca9f4ffe1ccf3ee9d5239ec6bd" Mar 07 21:28:31.041079 master-0 kubenswrapper[16352]: I0307 21:28:31.040949 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-4-retry-1-master-0"] Mar 07 21:28:31.042360 master-0 kubenswrapper[16352]: E0307 21:28:31.041642 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a69242fc-53d6-48f5-82a9-52daf194d047" containerName="installer" Mar 07 21:28:31.042360 master-0 kubenswrapper[16352]: I0307 21:28:31.041711 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="a69242fc-53d6-48f5-82a9-52daf194d047" containerName="installer" Mar 07 21:28:31.042360 master-0 kubenswrapper[16352]: E0307 21:28:31.041758 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5be2b4e6-94a5-4282-ba6f-86dc7634a28d" containerName="installer" Mar 07 21:28:31.042360 master-0 kubenswrapper[16352]: I0307 21:28:31.041779 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="5be2b4e6-94a5-4282-ba6f-86dc7634a28d" containerName="installer" Mar 07 21:28:31.042360 master-0 kubenswrapper[16352]: I0307 21:28:31.042166 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="a69242fc-53d6-48f5-82a9-52daf194d047" containerName="installer" Mar 07 21:28:31.042360 master-0 kubenswrapper[16352]: I0307 21:28:31.042206 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="5be2b4e6-94a5-4282-ba6f-86dc7634a28d" containerName="installer" Mar 07 21:28:31.043307 master-0 kubenswrapper[16352]: I0307 21:28:31.043240 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.050778 master-0 kubenswrapper[16352]: I0307 21:28:31.050666 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-v4v2q" Mar 07 21:28:31.051100 master-0 kubenswrapper[16352]: I0307 21:28:31.050786 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 07 21:28:31.060129 master-0 kubenswrapper[16352]: I0307 21:28:31.060022 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-retry-1-master-0"] Mar 07 21:28:31.154974 master-0 kubenswrapper[16352]: I0307 21:28:31.154881 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.155437 master-0 kubenswrapper[16352]: I0307 21:28:31.155242 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c0f8565-4b6b-42b2-835c-035812f033f6-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.155509 master-0 kubenswrapper[16352]: I0307 21:28:31.155414 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.256469 master-0 
kubenswrapper[16352]: I0307 21:28:31.256344 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.256877 master-0 kubenswrapper[16352]: I0307 21:28:31.256554 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c0f8565-4b6b-42b2-835c-035812f033f6-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.256877 master-0 kubenswrapper[16352]: I0307 21:28:31.256574 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-kubelet-dir\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.256877 master-0 kubenswrapper[16352]: I0307 21:28:31.256601 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.257221 master-0 kubenswrapper[16352]: I0307 21:28:31.256917 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-var-lock\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " 
pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.288375 master-0 kubenswrapper[16352]: I0307 21:28:31.288257 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c0f8565-4b6b-42b2-835c-035812f033f6-kube-api-access\") pod \"installer-4-retry-1-master-0\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") " pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.381734 master-0 kubenswrapper[16352]: I0307 21:28:31.381596 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" Mar 07 21:28:31.889161 master-0 kubenswrapper[16352]: I0307 21:28:31.889080 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-4-retry-1-master-0"] Mar 07 21:28:32.903273 master-0 kubenswrapper[16352]: I0307 21:28:32.903176 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" event={"ID":"3c0f8565-4b6b-42b2-835c-035812f033f6","Type":"ContainerStarted","Data":"310c39c43cc82b035920d5b5d9e447bbaeb406cde5e520ceca5c08491b290b98"} Mar 07 21:28:32.903273 master-0 kubenswrapper[16352]: I0307 21:28:32.903264 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" event={"ID":"3c0f8565-4b6b-42b2-835c-035812f033f6","Type":"ContainerStarted","Data":"9833cf5ba6562b12539da206023b4cf850a62bf86c383c387a04f2bfbcefe313"} Mar 07 21:28:32.929529 master-0 kubenswrapper[16352]: I0307 21:28:32.929404 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" podStartSLOduration=1.92937638 podStartE2EDuration="1.92937638s" podCreationTimestamp="2026-03-07 21:28:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-07 21:28:32.923830672 +0000 UTC m=+635.994535761" watchObservedRunningTime="2026-03-07 21:28:32.92937638 +0000 UTC m=+636.000081459" Mar 07 21:28:38.576898 master-0 kubenswrapper[16352]: I0307 21:28:38.576765 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:28:38.578123 master-0 kubenswrapper[16352]: I0307 21:28:38.576897 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:28:41.188987 master-0 kubenswrapper[16352]: I0307 21:28:41.188868 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:41.190029 master-0 kubenswrapper[16352]: I0307 21:28:41.188997 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:41.220796 master-0 kubenswrapper[16352]: I0307 21:28:41.220655 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7d1e2356-8a3c-489a-816d-c2934aeaabc7" Mar 07 21:28:41.220796 master-0 kubenswrapper[16352]: I0307 21:28:41.220753 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podUID="7d1e2356-8a3c-489a-816d-c2934aeaabc7" Mar 07 21:28:41.240712 master-0 kubenswrapper[16352]: I0307 21:28:41.240564 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="32ef53c8-da77-469b-a0da-85cfc266e297" Mar 07 21:28:41.240712 master-0 kubenswrapper[16352]: I0307 21:28:41.240639 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podUID="32ef53c8-da77-469b-a0da-85cfc266e297" Mar 07 21:28:41.247843 master-0 kubenswrapper[16352]: I0307 21:28:41.245795 16352 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:41.263840 master-0 kubenswrapper[16352]: I0307 21:28:41.263286 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 07 21:28:41.264205 master-0 kubenswrapper[16352]: I0307 21:28:41.264021 16352 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:41.270748 master-0 kubenswrapper[16352]: I0307 21:28:41.268604 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:41.278968 master-0 kubenswrapper[16352]: I0307 21:28:41.278884 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 07 21:28:41.295143 master-0 kubenswrapper[16352]: I0307 21:28:41.295043 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 07 21:28:41.311859 master-0 kubenswrapper[16352]: I0307 21:28:41.308785 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 07 21:28:41.318486 master-0 kubenswrapper[16352]: I0307 21:28:41.318420 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-master-0"] Mar 07 21:28:41.326072 master-0 kubenswrapper[16352]: W0307 21:28:41.325988 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0603fd6b1c0835ce0492441d0e22b91c.slice/crio-b3e3aa632491addccae0de9b388096a7e7f8c3e1dd43734ce025405a03a3e8d2 WatchSource:0}: Error finding container b3e3aa632491addccae0de9b388096a7e7f8c3e1dd43734ce025405a03a3e8d2: Status 404 returned error can't find the container with id b3e3aa632491addccae0de9b388096a7e7f8c3e1dd43734ce025405a03a3e8d2 Mar 07 21:28:41.366424 master-0 kubenswrapper[16352]: I0307 21:28:41.366361 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:41.373034 master-0 kubenswrapper[16352]: I0307 21:28:41.372997 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-master-0"] Mar 07 21:28:41.410841 master-0 kubenswrapper[16352]: W0307 21:28:41.410746 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1453f6461bf5d599ad65a4656343ee91.slice/crio-abc371cb4e41a15c5fbcb329551de8ae76adfb8568e6c0df2129afcfc1958b60 WatchSource:0}: Error finding container abc371cb4e41a15c5fbcb329551de8ae76adfb8568e6c0df2129afcfc1958b60: Status 404 returned error can't find the container with id abc371cb4e41a15c5fbcb329551de8ae76adfb8568e6c0df2129afcfc1958b60 Mar 07 21:28:41.999206 master-0 kubenswrapper[16352]: I0307 21:28:41.999036 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0603fd6b1c0835ce0492441d0e22b91c","Type":"ContainerStarted","Data":"296ccd1f5f903abbfb7cd2fe7c0feb2ff12c84a4db4680cc5e00f2a090a41f59"} Mar 07 21:28:41.999206 master-0 kubenswrapper[16352]: I0307 21:28:41.999116 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0603fd6b1c0835ce0492441d0e22b91c","Type":"ContainerStarted","Data":"61065fbb8706e680fd9d4b55e3fa79db6ff07a75715966e4571013cc2b0ff419"} Mar 07 21:28:41.999206 master-0 kubenswrapper[16352]: I0307 21:28:41.999129 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0603fd6b1c0835ce0492441d0e22b91c","Type":"ContainerStarted","Data":"b3e3aa632491addccae0de9b388096a7e7f8c3e1dd43734ce025405a03a3e8d2"} Mar 07 21:28:42.003176 master-0 kubenswrapper[16352]: I0307 21:28:42.003124 16352 generic.go:334] "Generic 
(PLEG): container finished" podID="1453f6461bf5d599ad65a4656343ee91" containerID="131501fda3625f6ad52d2bdeadea4d328e22f6cb927fdbbc996503e024e35b78" exitCode=0 Mar 07 21:28:42.003281 master-0 kubenswrapper[16352]: I0307 21:28:42.003202 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerDied","Data":"131501fda3625f6ad52d2bdeadea4d328e22f6cb927fdbbc996503e024e35b78"} Mar 07 21:28:42.003281 master-0 kubenswrapper[16352]: I0307 21:28:42.003271 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"abc371cb4e41a15c5fbcb329551de8ae76adfb8568e6c0df2129afcfc1958b60"} Mar 07 21:28:43.018864 master-0 kubenswrapper[16352]: I0307 21:28:43.018785 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0603fd6b1c0835ce0492441d0e22b91c","Type":"ContainerStarted","Data":"b546e2e6d5d6577d76b6e2e6c0e1cb462d959ed7919cbfc3ad836f4e8d6a51fb"} Mar 07 21:28:43.018864 master-0 kubenswrapper[16352]: I0307 21:28:43.018849 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0603fd6b1c0835ce0492441d0e22b91c","Type":"ContainerStarted","Data":"dc16ee5dd6d2af54dae2117604026a8d0077086433f7014f7fd53a605991a3d8"} Mar 07 21:28:43.024516 master-0 kubenswrapper[16352]: I0307 21:28:43.024459 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"ebba65d706658502e13236655dd60deedaea494ba51f89af58f569b3ab87fb65"} Mar 07 21:28:43.024635 master-0 kubenswrapper[16352]: I0307 21:28:43.024524 16352 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"b4550ab81c413719c70d090e53bf86b1fd7b544d0fd1f2f3919a6c2a4c81bcc4"} Mar 07 21:28:43.024635 master-0 kubenswrapper[16352]: I0307 21:28:43.024545 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" event={"ID":"1453f6461bf5d599ad65a4656343ee91","Type":"ContainerStarted","Data":"09a378303b7b9839e7131867536853e6fc79b6774518a008a68a5ac0f6f1c856"} Mar 07 21:28:43.024635 master-0 kubenswrapper[16352]: I0307 21:28:43.024622 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:28:43.047603 master-0 kubenswrapper[16352]: I0307 21:28:43.047361 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" podStartSLOduration=2.047318032 podStartE2EDuration="2.047318032s" podCreationTimestamp="2026-03-07 21:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:28:43.045248581 +0000 UTC m=+646.115953640" watchObservedRunningTime="2026-03-07 21:28:43.047318032 +0000 UTC m=+646.118023091" Mar 07 21:28:43.069455 master-0 kubenswrapper[16352]: I0307 21:28:43.069335 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" podStartSLOduration=2.069307729 podStartE2EDuration="2.069307729s" podCreationTimestamp="2026-03-07 21:28:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:28:43.064055238 +0000 UTC m=+646.134760377" watchObservedRunningTime="2026-03-07 21:28:43.069307729 +0000 UTC 
m=+646.140012798" Mar 07 21:28:44.994399 master-0 kubenswrapper[16352]: I0307 21:28:44.994242 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" podUID="86a3c1de-a810-4b48-be89-1b05da316a28" containerName="oauth-openshift" containerID="cri-o://50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f" gracePeriod=15 Mar 07 21:28:45.569511 master-0 kubenswrapper[16352]: I0307 21:28:45.569446 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:28:45.641214 master-0 kubenswrapper[16352]: I0307 21:28:45.641141 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86a3c1de-a810-4b48-be89-1b05da316a28-audit-dir\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.641214 master-0 kubenswrapper[16352]: I0307 21:28:45.641222 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-audit-policies\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.641649 master-0 kubenswrapper[16352]: I0307 21:28:45.641278 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-login\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.641649 master-0 kubenswrapper[16352]: I0307 21:28:45.641307 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-trusted-ca-bundle\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.641649 master-0 kubenswrapper[16352]: I0307 21:28:45.641336 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-ocp-branding-template\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.641649 master-0 kubenswrapper[16352]: I0307 21:28:45.641398 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-serving-cert\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.641649 master-0 kubenswrapper[16352]: I0307 21:28:45.641455 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-error\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.641649 master-0 kubenswrapper[16352]: I0307 21:28:45.641311 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/86a3c1de-a810-4b48-be89-1b05da316a28-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:28:45.642196 master-0 kubenswrapper[16352]: I0307 21:28:45.642156 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-provider-selection\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.642253 master-0 kubenswrapper[16352]: I0307 21:28:45.642199 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml8xn\" (UniqueName: \"kubernetes.io/projected/86a3c1de-a810-4b48-be89-1b05da316a28-kube-api-access-ml8xn\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.642253 master-0 kubenswrapper[16352]: I0307 21:28:45.642238 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-session\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.642330 master-0 kubenswrapper[16352]: I0307 21:28:45.642274 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-cliconfig\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.642402 master-0 kubenswrapper[16352]: I0307 21:28:45.642280 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: 
"86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:45.642477 master-0 kubenswrapper[16352]: I0307 21:28:45.642273 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:45.642477 master-0 kubenswrapper[16352]: I0307 21:28:45.642326 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-router-certs\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.642690 master-0 kubenswrapper[16352]: I0307 21:28:45.642635 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-service-ca\") pod \"86a3c1de-a810-4b48-be89-1b05da316a28\" (UID: \"86a3c1de-a810-4b48-be89-1b05da316a28\") " Mar 07 21:28:45.643154 master-0 kubenswrapper[16352]: I0307 21:28:45.643109 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:45.644071 master-0 kubenswrapper[16352]: I0307 21:28:45.644000 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.644071 master-0 kubenswrapper[16352]: I0307 21:28:45.644063 16352 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/86a3c1de-a810-4b48-be89-1b05da316a28-audit-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.644170 master-0 kubenswrapper[16352]: I0307 21:28:45.644083 16352 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-audit-policies\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.645708 master-0 kubenswrapper[16352]: I0307 21:28:45.645112 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.645708 master-0 kubenswrapper[16352]: I0307 21:28:45.644035 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:28:45.650741 master-0 kubenswrapper[16352]: I0307 21:28:45.646007 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:45.650741 master-0 kubenswrapper[16352]: I0307 21:28:45.646057 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:45.650741 master-0 kubenswrapper[16352]: I0307 21:28:45.647041 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/86a3c1de-a810-4b48-be89-1b05da316a28-kube-api-access-ml8xn" (OuterVolumeSpecName: "kube-api-access-ml8xn") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "kube-api-access-ml8xn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:28:45.653204 master-0 kubenswrapper[16352]: I0307 21:28:45.653127 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:45.659344 master-0 kubenswrapper[16352]: I0307 21:28:45.659244 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:45.659454 master-0 kubenswrapper[16352]: I0307 21:28:45.659324 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:45.659454 master-0 kubenswrapper[16352]: I0307 21:28:45.659367 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:45.659561 master-0 kubenswrapper[16352]: I0307 21:28:45.659465 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "86a3c1de-a810-4b48-be89-1b05da316a28" (UID: "86a3c1de-a810-4b48-be89-1b05da316a28"). 
InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:28:45.747127 master-0 kubenswrapper[16352]: I0307 21:28:45.747016 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-provider-selection\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.747127 master-0 kubenswrapper[16352]: I0307 21:28:45.747102 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ml8xn\" (UniqueName: \"kubernetes.io/projected/86a3c1de-a810-4b48-be89-1b05da316a28-kube-api-access-ml8xn\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.747127 master-0 kubenswrapper[16352]: I0307 21:28:45.747120 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-session\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.747127 master-0 kubenswrapper[16352]: I0307 21:28:45.747133 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-cliconfig\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.747127 master-0 kubenswrapper[16352]: I0307 21:28:45.747147 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-router-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.747583 master-0 kubenswrapper[16352]: I0307 21:28:45.747160 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-login\") on node \"master-0\" 
DevicePath \"\"" Mar 07 21:28:45.747583 master-0 kubenswrapper[16352]: I0307 21:28:45.747173 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-ocp-branding-template\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.747583 master-0 kubenswrapper[16352]: I0307 21:28:45.747187 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-system-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:45.747583 master-0 kubenswrapper[16352]: I0307 21:28:45.747200 16352 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/86a3c1de-a810-4b48-be89-1b05da316a28-v4-0-config-user-template-error\") on node \"master-0\" DevicePath \"\"" Mar 07 21:28:46.058516 master-0 kubenswrapper[16352]: I0307 21:28:46.058396 16352 generic.go:334] "Generic (PLEG): container finished" podID="86a3c1de-a810-4b48-be89-1b05da316a28" containerID="50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f" exitCode=0 Mar 07 21:28:46.058516 master-0 kubenswrapper[16352]: I0307 21:28:46.058509 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" event={"ID":"86a3c1de-a810-4b48-be89-1b05da316a28","Type":"ContainerDied","Data":"50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f"} Mar 07 21:28:46.059377 master-0 kubenswrapper[16352]: I0307 21:28:46.058559 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" Mar 07 21:28:46.059377 master-0 kubenswrapper[16352]: I0307 21:28:46.058589 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-698d9d45c9-5wh7z" event={"ID":"86a3c1de-a810-4b48-be89-1b05da316a28","Type":"ContainerDied","Data":"1efe282d92b3c51f1f47e2abc1e2f6213386af1e76b2de2bb7c7c90e7f88b389"} Mar 07 21:28:46.059377 master-0 kubenswrapper[16352]: I0307 21:28:46.058611 16352 scope.go:117] "RemoveContainer" containerID="50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f" Mar 07 21:28:46.079068 master-0 kubenswrapper[16352]: I0307 21:28:46.078997 16352 scope.go:117] "RemoveContainer" containerID="50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f" Mar 07 21:28:46.079752 master-0 kubenswrapper[16352]: E0307 21:28:46.079669 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f\": container with ID starting with 50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f not found: ID does not exist" containerID="50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f" Mar 07 21:28:46.079828 master-0 kubenswrapper[16352]: I0307 21:28:46.079759 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f"} err="failed to get container status \"50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f\": rpc error: code = NotFound desc = could not find container \"50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f\": container with ID starting with 50bd7de45482b4cdaa717aff30cb040df5bca6e7209b9754a6ef8a4acdc49b2f not found: ID does not exist" Mar 07 21:28:46.108321 master-0 kubenswrapper[16352]: I0307 21:28:46.108218 16352 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-698d9d45c9-5wh7z"] Mar 07 21:28:46.114541 master-0 kubenswrapper[16352]: I0307 21:28:46.114469 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-698d9d45c9-5wh7z"] Mar 07 21:28:47.205313 master-0 kubenswrapper[16352]: I0307 21:28:47.205202 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="86a3c1de-a810-4b48-be89-1b05da316a28" path="/var/lib/kubelet/pods/86a3c1de-a810-4b48-be89-1b05da316a28/volumes" Mar 07 21:28:48.577146 master-0 kubenswrapper[16352]: I0307 21:28:48.577064 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:28:48.578109 master-0 kubenswrapper[16352]: I0307 21:28:48.577986 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:28:51.270196 master-0 kubenswrapper[16352]: I0307 21:28:51.270102 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:51.270196 master-0 kubenswrapper[16352]: I0307 21:28:51.270208 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:51.271288 master-0 kubenswrapper[16352]: I0307 21:28:51.270231 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:51.271288 master-0 
kubenswrapper[16352]: I0307 21:28:51.270257 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:51.279197 master-0 kubenswrapper[16352]: I0307 21:28:51.279139 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:51.280783 master-0 kubenswrapper[16352]: I0307 21:28:51.280552 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:52.134965 master-0 kubenswrapper[16352]: I0307 21:28:52.134880 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:52.134965 master-0 kubenswrapper[16352]: I0307 21:28:52.134975 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:28:58.577515 master-0 kubenswrapper[16352]: I0307 21:28:58.577412 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:28:58.577515 master-0 kubenswrapper[16352]: I0307 21:28:58.577508 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:29:02.842800 master-0 kubenswrapper[16352]: I0307 21:29:02.842726 16352 scope.go:117] "RemoveContainer" containerID="801c0e6645c48e23de0745ca7de89bfebe070d2e4b76a9fdb72366ccb7b3154a" 
Mar 07 21:29:02.859248 master-0 kubenswrapper[16352]: I0307 21:29:02.859161 16352 scope.go:117] "RemoveContainer" containerID="95f51830d903c41c8ee7ab7a8d7de2c678711e9e11302d94e3d2db00f6dd7437" Mar 07 21:29:02.874406 master-0 kubenswrapper[16352]: I0307 21:29:02.874361 16352 scope.go:117] "RemoveContainer" containerID="29fe93c228bd77fd76218d416ea847bb6245a51f2342d94714a2572a13bb2ff1" Mar 07 21:29:08.577800 master-0 kubenswrapper[16352]: I0307 21:29:08.577721 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:29:08.578994 master-0 kubenswrapper[16352]: I0307 21:29:08.578872 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:29:10.458935 master-0 kubenswrapper[16352]: I0307 21:29:10.458790 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 07 21:29:10.459803 master-0 kubenswrapper[16352]: E0307 21:29:10.459449 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="86a3c1de-a810-4b48-be89-1b05da316a28" containerName="oauth-openshift" Mar 07 21:29:10.459803 master-0 kubenswrapper[16352]: I0307 21:29:10.459480 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="86a3c1de-a810-4b48-be89-1b05da316a28" containerName="oauth-openshift" Mar 07 21:29:10.459921 master-0 kubenswrapper[16352]: I0307 21:29:10.459846 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="86a3c1de-a810-4b48-be89-1b05da316a28" containerName="oauth-openshift" Mar 07 21:29:10.460715 master-0 
kubenswrapper[16352]: I0307 21:29:10.460639 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:29:10.511916 master-0 kubenswrapper[16352]: I0307 21:29:10.511802 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"] Mar 07 21:29:10.512335 master-0 kubenswrapper[16352]: I0307 21:29:10.512281 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver" containerID="cri-o://360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5" gracePeriod=15 Mar 07 21:29:10.512537 master-0 kubenswrapper[16352]: I0307 21:29:10.512365 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints" containerID="cri-o://3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d" gracePeriod=15 Mar 07 21:29:10.512537 master-0 kubenswrapper[16352]: I0307 21:29:10.512486 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166" gracePeriod=15 Mar 07 21:29:10.512889 master-0 kubenswrapper[16352]: I0307 21:29:10.512486 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4" gracePeriod=15 Mar 07 21:29:10.512889 
master-0 kubenswrapper[16352]: I0307 21:29:10.512485 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer" containerID="cri-o://5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425" gracePeriod=15
Mar 07 21:29:10.515564 master-0 kubenswrapper[16352]: I0307 21:29:10.515485 16352 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 07 21:29:10.516095 master-0 kubenswrapper[16352]: E0307 21:29:10.516051 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 07 21:29:10.516095 master-0 kubenswrapper[16352]: I0307 21:29:10.516085 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: E0307 21:29:10.516123 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: I0307 21:29:10.516139 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: E0307 21:29:10.516162 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: I0307 21:29:10.516175 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: E0307 21:29:10.516204 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: I0307 21:29:10.516216 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: E0307 21:29:10.516229 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: I0307 21:29:10.516241 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 07 21:29:10.516263 master-0 kubenswrapper[16352]: E0307 21:29:10.516278 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 07 21:29:10.516820 master-0 kubenswrapper[16352]: I0307 21:29:10.516293 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 07 21:29:10.516820 master-0 kubenswrapper[16352]: I0307 21:29:10.516516 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver"
Mar 07 21:29:10.516820 master-0 kubenswrapper[16352]: I0307 21:29:10.516547 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-check-endpoints"
Mar 07 21:29:10.516820 master-0 kubenswrapper[16352]: I0307 21:29:10.516579 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-regeneration-controller"
Mar 07 21:29:10.516820 master-0 kubenswrapper[16352]: I0307 21:29:10.516610 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="setup"
Mar 07 21:29:10.516820 master-0 kubenswrapper[16352]: I0307 21:29:10.516634 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-insecure-readyz"
Mar 07 21:29:10.516820 master-0 kubenswrapper[16352]: I0307 21:29:10.516665 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerName="kube-apiserver-cert-syncer"
Mar 07 21:29:10.535143 master-0 kubenswrapper[16352]: I0307 21:29:10.534016 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"]
Mar 07 21:29:10.550892 master-0 kubenswrapper[16352]: I0307 21:29:10.550803 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.550892 master-0 kubenswrapper[16352]: I0307 21:29:10.550891 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.551319 master-0 kubenswrapper[16352]: I0307 21:29:10.551252 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.551510 master-0 kubenswrapper[16352]: I0307 21:29:10.551445 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.551575 master-0 kubenswrapper[16352]: I0307 21:29:10.551528 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654630 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654741 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654769 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654806 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654835 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654889 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654920 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.654966 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.655082 master-0 kubenswrapper[16352]: I0307 21:29:10.655070 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.657751 master-0 kubenswrapper[16352]: I0307 21:29:10.655136 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.657751 master-0 kubenswrapper[16352]: I0307 21:29:10.655167 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.657751 master-0 kubenswrapper[16352]: I0307 21:29:10.655196 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.657751 master-0 kubenswrapper[16352]: I0307 21:29:10.655232 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-master-0\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.756958 master-0 kubenswrapper[16352]: I0307 21:29:10.756785 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.757204 master-0 kubenswrapper[16352]: I0307 21:29:10.757047 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.757204 master-0 kubenswrapper[16352]: I0307 21:29:10.757047 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-audit-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.757204 master-0 kubenswrapper[16352]: I0307 21:29:10.757109 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.758367 master-0 kubenswrapper[16352]: I0307 21:29:10.757497 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-cert-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.758367 master-0 kubenswrapper[16352]: I0307 21:29:10.757589 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/48512e02022680c9d90092634f0fc146-resource-dir\") pod \"kube-apiserver-master-0\" (UID: \"48512e02022680c9d90092634f0fc146\") " pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:10.825136 master-0 kubenswrapper[16352]: I0307 21:29:10.825002 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"
Mar 07 21:29:10.866633 master-0 kubenswrapper[16352]: W0307 21:29:10.866484 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a18cac8a90d6913a6a0391d805cddc9.slice/crio-a06ee63ce1e4f1eba8320486cf206044cf262532035daeede8d86146a80d5cc0 WatchSource:0}: Error finding container a06ee63ce1e4f1eba8320486cf206044cf262532035daeede8d86146a80d5cc0: Status 404 returned error can't find the container with id a06ee63ce1e4f1eba8320486cf206044cf262532035daeede8d86146a80d5cc0
Mar 07 21:29:10.871555 master-0 kubenswrapper[16352]: E0307 21:29:10.871357 16352 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189aac647a27fee2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:3a18cac8a90d6913a6a0391d805cddc9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:29:10.869860066 +0000 UTC m=+673.940565165,LastTimestamp:2026-03-07 21:29:10.869860066 +0000 UTC m=+673.940565165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}"
Mar 07 21:29:11.342518 master-0 kubenswrapper[16352]: I0307 21:29:11.342077 16352 generic.go:334] "Generic (PLEG): container finished" podID="3c0f8565-4b6b-42b2-835c-035812f033f6" containerID="310c39c43cc82b035920d5b5d9e447bbaeb406cde5e520ceca5c08491b290b98" exitCode=0
Mar 07 21:29:11.342518 master-0 kubenswrapper[16352]: I0307 21:29:11.342182 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" event={"ID":"3c0f8565-4b6b-42b2-835c-035812f033f6","Type":"ContainerDied","Data":"310c39c43cc82b035920d5b5d9e447bbaeb406cde5e520ceca5c08491b290b98"}
Mar 07 21:29:11.344248 master-0 kubenswrapper[16352]: I0307 21:29:11.344192 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:11.345129 master-0 kubenswrapper[16352]: I0307 21:29:11.345055 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:11.345405 master-0 kubenswrapper[16352]: I0307 21:29:11.345366 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"3a18cac8a90d6913a6a0391d805cddc9","Type":"ContainerStarted","Data":"3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf"}
Mar 07 21:29:11.345485 master-0 kubenswrapper[16352]: I0307 21:29:11.345415 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" event={"ID":"3a18cac8a90d6913a6a0391d805cddc9","Type":"ContainerStarted","Data":"a06ee63ce1e4f1eba8320486cf206044cf262532035daeede8d86146a80d5cc0"}
Mar 07 21:29:11.347378 master-0 kubenswrapper[16352]: I0307 21:29:11.347305 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:11.348969 master-0 kubenswrapper[16352]: I0307 21:29:11.348881 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:11.350262 master-0 kubenswrapper[16352]: I0307 21:29:11.350189 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log"
Mar 07 21:29:11.351310 master-0 kubenswrapper[16352]: I0307 21:29:11.351198 16352 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d" exitCode=0
Mar 07 21:29:11.351310 master-0 kubenswrapper[16352]: I0307 21:29:11.351238 16352 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166" exitCode=0
Mar 07 21:29:11.351310 master-0 kubenswrapper[16352]: I0307 21:29:11.351254 16352 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4" exitCode=0
Mar 07 21:29:11.351310 master-0 kubenswrapper[16352]: I0307 21:29:11.351267 16352 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425" exitCode=2
Mar 07 21:29:12.995867 master-0 kubenswrapper[16352]: I0307 21:29:12.995751 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-retry-1-master-0"
Mar 07 21:29:12.997488 master-0 kubenswrapper[16352]: I0307 21:29:12.997387 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:12.998129 master-0 kubenswrapper[16352]: I0307 21:29:12.998074 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-kubelet-dir\") pod \"3c0f8565-4b6b-42b2-835c-035812f033f6\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") "
Mar 07 21:29:12.998248 master-0 kubenswrapper[16352]: I0307 21:29:12.998152 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-var-lock\") pod \"3c0f8565-4b6b-42b2-835c-035812f033f6\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") "
Mar 07 21:29:12.998473 master-0 kubenswrapper[16352]: I0307 21:29:12.998421 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "3c0f8565-4b6b-42b2-835c-035812f033f6" (UID: "3c0f8565-4b6b-42b2-835c-035812f033f6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:29:12.998716 master-0 kubenswrapper[16352]: I0307 21:29:12.998430 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-var-lock" (OuterVolumeSpecName: "var-lock") pod "3c0f8565-4b6b-42b2-835c-035812f033f6" (UID: "3c0f8565-4b6b-42b2-835c-035812f033f6"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:29:12.998926 master-0 kubenswrapper[16352]: I0307 21:29:12.998564 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.005624 master-0 kubenswrapper[16352]: I0307 21:29:13.005550 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log"
Mar 07 21:29:13.008508 master-0 kubenswrapper[16352]: I0307 21:29:13.007017 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:13.008508 master-0 kubenswrapper[16352]: I0307 21:29:13.008233 16352 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.009024 master-0 kubenswrapper[16352]: I0307 21:29:13.008977 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.010083 master-0 kubenswrapper[16352]: I0307 21:29:13.009954 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.100590 master-0 kubenswrapper[16352]: I0307 21:29:13.100526 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") "
Mar 07 21:29:13.101103 master-0 kubenswrapper[16352]: I0307 21:29:13.101060 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") "
Mar 07 21:29:13.101417 master-0 kubenswrapper[16352]: I0307 21:29:13.100752 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:29:13.101521 master-0 kubenswrapper[16352]: I0307 21:29:13.101157 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:29:13.101521 master-0 kubenswrapper[16352]: I0307 21:29:13.101388 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") pod \"cdcecc61ff5eeb08bd2a3ac12599e4f9\" (UID: \"cdcecc61ff5eeb08bd2a3ac12599e4f9\") "
Mar 07 21:29:13.101724 master-0 kubenswrapper[16352]: I0307 21:29:13.101653 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c0f8565-4b6b-42b2-835c-035812f033f6-kube-api-access\") pod \"3c0f8565-4b6b-42b2-835c-035812f033f6\" (UID: \"3c0f8565-4b6b-42b2-835c-035812f033f6\") "
Mar 07 21:29:13.101823 master-0 kubenswrapper[16352]: I0307 21:29:13.101712 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "cdcecc61ff5eeb08bd2a3ac12599e4f9" (UID: "cdcecc61ff5eeb08bd2a3ac12599e4f9"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:29:13.102223 master-0 kubenswrapper[16352]: I0307 21:29:13.102164 16352 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-resource-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:29:13.102223 master-0 kubenswrapper[16352]: I0307 21:29:13.102205 16352 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-audit-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:29:13.102223 master-0 kubenswrapper[16352]: I0307 21:29:13.102225 16352 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/cdcecc61ff5eeb08bd2a3ac12599e4f9-cert-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:29:13.102453 master-0 kubenswrapper[16352]: I0307 21:29:13.102245 16352 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-kubelet-dir\") on node \"master-0\" DevicePath \"\""
Mar 07 21:29:13.102453 master-0 kubenswrapper[16352]: I0307 21:29:13.102316 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3c0f8565-4b6b-42b2-835c-035812f033f6-var-lock\") on node \"master-0\" DevicePath \"\""
Mar 07 21:29:13.105181 master-0 kubenswrapper[16352]: I0307 21:29:13.105118 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0f8565-4b6b-42b2-835c-035812f033f6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "3c0f8565-4b6b-42b2-835c-035812f033f6" (UID: "3c0f8565-4b6b-42b2-835c-035812f033f6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:29:13.203897 master-0 kubenswrapper[16352]: I0307 21:29:13.203734 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/3c0f8565-4b6b-42b2-835c-035812f033f6-kube-api-access\") on node \"master-0\" DevicePath \"\""
Mar 07 21:29:13.203897 master-0 kubenswrapper[16352]: I0307 21:29:13.203755 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" path="/var/lib/kubelet/pods/cdcecc61ff5eeb08bd2a3ac12599e4f9/volumes"
Mar 07 21:29:13.390920 master-0 kubenswrapper[16352]: I0307 21:29:13.390821 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" event={"ID":"3c0f8565-4b6b-42b2-835c-035812f033f6","Type":"ContainerDied","Data":"9833cf5ba6562b12539da206023b4cf850a62bf86c383c387a04f2bfbcefe313"}
Mar 07 21:29:13.390920 master-0 kubenswrapper[16352]: I0307 21:29:13.390909 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9833cf5ba6562b12539da206023b4cf850a62bf86c383c387a04f2bfbcefe313"
Mar 07 21:29:13.391295 master-0 kubenswrapper[16352]: I0307 21:29:13.391021 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-4-retry-1-master-0"
Mar 07 21:29:13.397761 master-0 kubenswrapper[16352]: I0307 21:29:13.397674 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-master-0_cdcecc61ff5eeb08bd2a3ac12599e4f9/kube-apiserver-cert-syncer/0.log"
Mar 07 21:29:13.399140 master-0 kubenswrapper[16352]: I0307 21:29:13.399018 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.399266 master-0 kubenswrapper[16352]: I0307 21:29:13.399130 16352 generic.go:334] "Generic (PLEG): container finished" podID="cdcecc61ff5eeb08bd2a3ac12599e4f9" containerID="360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5" exitCode=0
Mar 07 21:29:13.399266 master-0 kubenswrapper[16352]: I0307 21:29:13.399222 16352 scope.go:117] "RemoveContainer" containerID="3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d"
Mar 07 21:29:13.399607 master-0 kubenswrapper[16352]: I0307 21:29:13.399570 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:13.400387 master-0 kubenswrapper[16352]: I0307 21:29:13.400318 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.401440 master-0 kubenswrapper[16352]: I0307 21:29:13.401383 16352 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.402443 master-0 kubenswrapper[16352]: I0307 21:29:13.402356 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.403368 master-0 kubenswrapper[16352]: I0307 21:29:13.403295 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.406647 master-0 kubenswrapper[16352]: I0307 21:29:13.406563 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.407778 master-0 kubenswrapper[16352]: I0307 21:29:13.407644 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.408989 master-0 kubenswrapper[16352]: I0307 21:29:13.408883 16352 status_manager.go:851] "Failed to get status for pod" podUID="cdcecc61ff5eeb08bd2a3ac12599e4f9" pod="openshift-kube-apiserver/kube-apiserver-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused"
Mar 07 21:29:13.430926 master-0 kubenswrapper[16352]: I0307 21:29:13.430858 16352 scope.go:117] "RemoveContainer" containerID="8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166"
Mar 07 21:29:13.457033 master-0 kubenswrapper[16352]: I0307 21:29:13.456961 16352 scope.go:117] "RemoveContainer" containerID="cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4"
Mar 07 21:29:13.482289 master-0 kubenswrapper[16352]: I0307 21:29:13.482220 16352 scope.go:117] "RemoveContainer" containerID="5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425"
Mar 07 21:29:13.516934 master-0 kubenswrapper[16352]: I0307 21:29:13.516414 16352 scope.go:117] "RemoveContainer" containerID="360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5"
Mar 07 21:29:13.544903 master-0 kubenswrapper[16352]: I0307 21:29:13.544779 16352 scope.go:117] "RemoveContainer" containerID="5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612"
Mar 07 21:29:13.568755 master-0 kubenswrapper[16352]: I0307 21:29:13.568595 16352 scope.go:117] "RemoveContainer" containerID="3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d"
Mar 07 21:29:13.569417 master-0 kubenswrapper[16352]: E0307 21:29:13.569367 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d\": container with ID starting with 3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d not found: ID does not exist" containerID="3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d"
Mar 07 21:29:13.569535 master-0 kubenswrapper[16352]: I0307 21:29:13.569429 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d"} err="failed to get container status \"3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d\": rpc error: code = NotFound desc = could not find container \"3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d\": container with ID starting with 3bd730e027cc7082d8ea92e33970ddac2dee19d2bd4f560af69c08f0ddd2cc1d not found: ID does not exist"
Mar 07 21:29:13.569535 master-0 kubenswrapper[16352]: I0307 21:29:13.569468 16352 scope.go:117] "RemoveContainer" containerID="8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166"
Mar 07 21:29:13.570544 master-0 kubenswrapper[16352]: E0307 21:29:13.570227 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166\": container with ID starting with 8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166 not found: ID does not exist" containerID="8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166"
Mar 07 21:29:13.570544 master-0 kubenswrapper[16352]: I0307 21:29:13.570312 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166"} err="failed to get container status \"8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166\": rpc error: code = NotFound desc = could not find container \"8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166\": container with ID starting with 8b00ca370b3b3cdf137701e1b71d11fa91aaa0f3c26e684c8ad4f993772d8166 not found: ID does not exist"
Mar 07 21:29:13.570544 master-0 kubenswrapper[16352]: I0307 21:29:13.570371 16352 scope.go:117] "RemoveContainer" containerID="cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4"
Mar 07 21:29:13.571322 master-0 kubenswrapper[16352]: E0307 21:29:13.571072 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4\": container with ID starting with cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4 not found: ID does not exist" containerID="cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4"
Mar 07 21:29:13.571322 master-0 kubenswrapper[16352]: I0307 21:29:13.571127 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4"} err="failed to get container status \"cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4\": rpc error: code = NotFound desc = could not find container \"cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4\": container with ID starting with cc76bd79497244017a54b770e92dc62f5300c66a8a969aa6eae94b30ac97e2b4 not found: ID does not exist"
Mar 07 21:29:13.571322 master-0
kubenswrapper[16352]: I0307 21:29:13.571159 16352 scope.go:117] "RemoveContainer" containerID="5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425" Mar 07 21:29:13.571553 master-0 kubenswrapper[16352]: E0307 21:29:13.571467 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425\": container with ID starting with 5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425 not found: ID does not exist" containerID="5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425" Mar 07 21:29:13.571553 master-0 kubenswrapper[16352]: I0307 21:29:13.571518 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425"} err="failed to get container status \"5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425\": rpc error: code = NotFound desc = could not find container \"5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425\": container with ID starting with 5d77b973851cf5c8c87a843dd237de4d267325aedad967b11ad01f8332190425 not found: ID does not exist" Mar 07 21:29:13.571553 master-0 kubenswrapper[16352]: I0307 21:29:13.571534 16352 scope.go:117] "RemoveContainer" containerID="360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5" Mar 07 21:29:13.572193 master-0 kubenswrapper[16352]: E0307 21:29:13.572126 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5\": container with ID starting with 360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5 not found: ID does not exist" containerID="360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5" Mar 07 21:29:13.572296 master-0 kubenswrapper[16352]: I0307 21:29:13.572184 
16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5"} err="failed to get container status \"360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5\": rpc error: code = NotFound desc = could not find container \"360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5\": container with ID starting with 360d3026f523e90afbe9b1291f5a43bc9a963880318ae24cb6b10127f7962bb5 not found: ID does not exist" Mar 07 21:29:13.572296 master-0 kubenswrapper[16352]: I0307 21:29:13.572243 16352 scope.go:117] "RemoveContainer" containerID="5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612" Mar 07 21:29:13.573362 master-0 kubenswrapper[16352]: E0307 21:29:13.572608 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612\": container with ID starting with 5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612 not found: ID does not exist" containerID="5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612" Mar 07 21:29:13.573362 master-0 kubenswrapper[16352]: I0307 21:29:13.572634 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612"} err="failed to get container status \"5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612\": rpc error: code = NotFound desc = could not find container \"5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612\": container with ID starting with 5b03787d7a4301edd1b0b230569109d272f0c3d6d22927755493f6d222f88612 not found: ID does not exist" Mar 07 21:29:17.201432 master-0 kubenswrapper[16352]: I0307 21:29:17.201333 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" 
pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:17.202285 master-0 kubenswrapper[16352]: I0307 21:29:17.202198 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:18.576582 master-0 kubenswrapper[16352]: I0307 21:29:18.576486 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:29:18.577664 master-0 kubenswrapper[16352]: I0307 21:29:18.576594 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:29:20.047795 master-0 kubenswrapper[16352]: E0307 21:29:20.047669 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:20.049235 master-0 kubenswrapper[16352]: E0307 21:29:20.049176 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial 
tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:20.050618 master-0 kubenswrapper[16352]: E0307 21:29:20.050553 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:20.051801 master-0 kubenswrapper[16352]: E0307 21:29:20.051733 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:20.052882 master-0 kubenswrapper[16352]: E0307 21:29:20.052743 16352 controller.go:195] "Failed to update lease" err="Put \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:20.052882 master-0 kubenswrapper[16352]: I0307 21:29:20.052880 16352 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 07 21:29:20.053898 master-0 kubenswrapper[16352]: E0307 21:29:20.053843 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="200ms" Mar 07 21:29:20.255530 master-0 kubenswrapper[16352]: E0307 21:29:20.255437 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="400ms" Mar 07 21:29:20.581842 master-0 kubenswrapper[16352]: E0307 
21:29:20.581638 16352 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 192.168.32.10:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-master-0.189aac647a27fee2 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-master-0,UID:3a18cac8a90d6913a6a0391d805cddc9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5500329ab50804678fb8a90b96bf2a469bca16b620fb6dd2f5f5a17106e94898\" already present on machine,Source:EventSource{Component:kubelet,Host:master-0,},FirstTimestamp:2026-03-07 21:29:10.869860066 +0000 UTC m=+673.940565165,LastTimestamp:2026-03-07 21:29:10.869860066 +0000 UTC m=+673.940565165,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:master-0,}" Mar 07 21:29:20.657546 master-0 kubenswrapper[16352]: E0307 21:29:20.657176 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="800ms" Mar 07 21:29:21.459381 master-0 kubenswrapper[16352]: E0307 21:29:21.459210 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="1.6s" Mar 07 21:29:23.060549 master-0 kubenswrapper[16352]: E0307 21:29:23.060407 16352 controller.go:145] "Failed to 
ensure lease exists, will retry" err="Get \"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="3.2s" Mar 07 21:29:25.189337 master-0 kubenswrapper[16352]: I0307 21:29:25.189216 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:25.192641 master-0 kubenswrapper[16352]: I0307 21:29:25.192539 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:25.193870 master-0 kubenswrapper[16352]: I0307 21:29:25.193817 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:25.228874 master-0 kubenswrapper[16352]: I0307 21:29:25.228790 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164" Mar 07 21:29:25.228874 master-0 kubenswrapper[16352]: I0307 21:29:25.228847 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164" Mar 07 21:29:25.230031 master-0 kubenswrapper[16352]: E0307 21:29:25.229955 16352 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:25.231026 master-0 kubenswrapper[16352]: I0307 21:29:25.230972 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:25.292873 master-0 kubenswrapper[16352]: W0307 21:29:25.292770 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48512e02022680c9d90092634f0fc146.slice/crio-8ddfa72353982cdcb6578a18b0e558c121088844387a74bdb9997c49c68018d3 WatchSource:0}: Error finding container 8ddfa72353982cdcb6578a18b0e558c121088844387a74bdb9997c49c68018d3: Status 404 returned error can't find the container with id 8ddfa72353982cdcb6578a18b0e558c121088844387a74bdb9997c49c68018d3 Mar 07 21:29:25.520024 master-0 kubenswrapper[16352]: I0307 21:29:25.519967 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0603fd6b1c0835ce0492441d0e22b91c/kube-controller-manager/0.log" Mar 07 21:29:25.520243 master-0 kubenswrapper[16352]: I0307 21:29:25.520026 16352 generic.go:334] "Generic (PLEG): container finished" podID="0603fd6b1c0835ce0492441d0e22b91c" containerID="61065fbb8706e680fd9d4b55e3fa79db6ff07a75715966e4571013cc2b0ff419" exitCode=1 Mar 07 21:29:25.520243 master-0 kubenswrapper[16352]: I0307 21:29:25.520093 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0603fd6b1c0835ce0492441d0e22b91c","Type":"ContainerDied","Data":"61065fbb8706e680fd9d4b55e3fa79db6ff07a75715966e4571013cc2b0ff419"} Mar 07 21:29:25.520728 master-0 kubenswrapper[16352]: I0307 21:29:25.520673 16352 scope.go:117] "RemoveContainer" 
containerID="61065fbb8706e680fd9d4b55e3fa79db6ff07a75715966e4571013cc2b0ff419" Mar 07 21:29:25.521299 master-0 kubenswrapper[16352]: I0307 21:29:25.521230 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:25.521369 master-0 kubenswrapper[16352]: I0307 21:29:25.521321 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"8ddfa72353982cdcb6578a18b0e558c121088844387a74bdb9997c49c68018d3"} Mar 07 21:29:25.521980 master-0 kubenswrapper[16352]: I0307 21:29:25.521850 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:25.522521 master-0 kubenswrapper[16352]: I0307 21:29:25.522467 16352 status_manager.go:851] "Failed to get status for pod" podUID="0603fd6b1c0835ce0492441d0e22b91c" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:26.262154 master-0 kubenswrapper[16352]: E0307 21:29:26.262040 16352 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.sno.openstack.lab:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/master-0?timeout=10s\": dial tcp 192.168.32.10:6443: connect: connection refused" interval="6.4s" Mar 07 21:29:26.542072 master-0 kubenswrapper[16352]: I0307 21:29:26.541879 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-master-0_0603fd6b1c0835ce0492441d0e22b91c/kube-controller-manager/0.log" Mar 07 21:29:26.542072 master-0 kubenswrapper[16352]: I0307 21:29:26.542012 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" event={"ID":"0603fd6b1c0835ce0492441d0e22b91c","Type":"ContainerStarted","Data":"c4047014c65875c118ab40cd5c7ae7c100c02b50a57cbc97deaac5945e46291d"} Mar 07 21:29:26.543950 master-0 kubenswrapper[16352]: I0307 21:29:26.543908 16352 status_manager.go:851] "Failed to get status for pod" podUID="0603fd6b1c0835ce0492441d0e22b91c" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:26.544448 master-0 kubenswrapper[16352]: I0307 21:29:26.544407 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:26.545221 master-0 kubenswrapper[16352]: I0307 21:29:26.545147 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:26.546186 master-0 kubenswrapper[16352]: I0307 21:29:26.546130 16352 generic.go:334] "Generic (PLEG): container finished" podID="48512e02022680c9d90092634f0fc146" containerID="412f2fd4f0929967b1f4cd18029bf8a435c0196c2807c0d6e091be43552d4e26" exitCode=0 Mar 07 21:29:26.546382 master-0 kubenswrapper[16352]: I0307 21:29:26.546262 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerDied","Data":"412f2fd4f0929967b1f4cd18029bf8a435c0196c2807c0d6e091be43552d4e26"} Mar 07 21:29:26.546593 master-0 kubenswrapper[16352]: I0307 21:29:26.546548 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164" Mar 07 21:29:26.546593 master-0 kubenswrapper[16352]: I0307 21:29:26.546586 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164" Mar 07 21:29:26.547255 master-0 kubenswrapper[16352]: E0307 21:29:26.547197 16352 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:26.547348 master-0 kubenswrapper[16352]: I0307 21:29:26.547290 16352 status_manager.go:851] "Failed to get status for pod" podUID="0603fd6b1c0835ce0492441d0e22b91c" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" err="Get 
\"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:26.548119 master-0 kubenswrapper[16352]: I0307 21:29:26.548050 16352 status_manager.go:851] "Failed to get status for pod" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" pod="openshift-kube-apiserver/installer-4-retry-1-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-4-retry-1-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:26.548832 master-0 kubenswrapper[16352]: I0307 21:29:26.548780 16352 status_manager.go:851] "Failed to get status for pod" podUID="3a18cac8a90d6913a6a0391d805cddc9" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" err="Get \"https://api-int.sno.openstack.lab:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-master-0\": dial tcp 192.168.32.10:6443: connect: connection refused" Mar 07 21:29:27.569336 master-0 kubenswrapper[16352]: I0307 21:29:27.569255 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"2b5db4540c95e94b034fde60c04234c248c3a92db5c859e866c0ce76e0f3abad"} Mar 07 21:29:27.569336 master-0 kubenswrapper[16352]: I0307 21:29:27.569330 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"7f184f129df52a2d993eee694a4de12ae91cdd2528aebc2feebdbecc6d9f3c7f"} Mar 07 21:29:27.569336 master-0 kubenswrapper[16352]: I0307 21:29:27.569342 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" 
event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"505ab3d3674d813d252fbc720134bb5b149da366ec98832987e43f96b7108057"} Mar 07 21:29:28.576501 master-0 kubenswrapper[16352]: I0307 21:29:28.576429 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:29:28.577020 master-0 kubenswrapper[16352]: I0307 21:29:28.576541 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:29:28.598705 master-0 kubenswrapper[16352]: I0307 21:29:28.597622 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"bff2ac307e2a4f1b3beff6ef84084fcce89b3a10df0e1aa925162a385112824e"} Mar 07 21:29:28.598705 master-0 kubenswrapper[16352]: I0307 21:29:28.597715 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" event={"ID":"48512e02022680c9d90092634f0fc146","Type":"ContainerStarted","Data":"526b2986004caa8a724ae707815c69674324247416d86fb8b3e81cfb5d472208"} Mar 07 21:29:28.598705 master-0 kubenswrapper[16352]: I0307 21:29:28.598046 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:28.598705 master-0 kubenswrapper[16352]: I0307 21:29:28.598136 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164" Mar 07 21:29:28.598705 master-0 
kubenswrapper[16352]: I0307 21:29:28.598184 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164" Mar 07 21:29:30.231842 master-0 kubenswrapper[16352]: I0307 21:29:30.231748 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:30.231842 master-0 kubenswrapper[16352]: I0307 21:29:30.231834 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:30.241022 master-0 kubenswrapper[16352]: I0307 21:29:30.240959 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:31.270533 master-0 kubenswrapper[16352]: I0307 21:29:31.270410 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:29:31.270533 master-0 kubenswrapper[16352]: I0307 21:29:31.270524 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:29:31.277519 master-0 kubenswrapper[16352]: I0307 21:29:31.277450 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-master-0" Mar 07 21:29:31.376781 master-0 kubenswrapper[16352]: I0307 21:29:31.376499 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-master-0" Mar 07 21:29:33.621503 master-0 kubenswrapper[16352]: I0307 21:29:33.621360 16352 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-master-0" Mar 07 21:29:33.652027 master-0 kubenswrapper[16352]: I0307 21:29:33.651873 16352 kubelet.go:1909] 
"Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164"
Mar 07 21:29:33.652027 master-0 kubenswrapper[16352]: I0307 21:29:33.651927 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164"
Mar 07 21:29:33.656576 master-0 kubenswrapper[16352]: I0307 21:29:33.656530 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:33.659479 master-0 kubenswrapper[16352]: I0307 21:29:33.659431 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="48512e02022680c9d90092634f0fc146" podUID="29816f9f-fbb6-4ab4-ba6b-c0a0645dd144"
Mar 07 21:29:34.662296 master-0 kubenswrapper[16352]: I0307 21:29:34.662232 16352 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164"
Mar 07 21:29:34.663267 master-0 kubenswrapper[16352]: I0307 21:29:34.662917 16352 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-master-0" podUID="2672ed3b-651d-4397-a52b-7e9443865164"
Mar 07 21:29:37.223059 master-0 kubenswrapper[16352]: I0307 21:29:37.222975 16352 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-master-0" oldPodUID="48512e02022680c9d90092634f0fc146" podUID="29816f9f-fbb6-4ab4-ba6b-c0a0645dd144"
Mar 07 21:29:38.577210 master-0 kubenswrapper[16352]: I0307 21:29:38.577114 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:29:38.578145 master-0 kubenswrapper[16352]: I0307 21:29:38.577225 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:29:41.278891 master-0 kubenswrapper[16352]: I0307 21:29:41.278783 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-master-0"
Mar 07 21:29:43.385106 master-0 kubenswrapper[16352]: I0307 21:29:43.384975 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"openshift-service-ca.crt"
Mar 07 21:29:43.581299 master-0 kubenswrapper[16352]: I0307 21:29:43.581208 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt"
Mar 07 21:29:43.930596 master-0 kubenswrapper[16352]: I0307 21:29:43.930515 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client"
Mar 07 21:29:43.970665 master-0 kubenswrapper[16352]: I0307 21:29:43.970508 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-credential-operator"/"cloud-credential-operator-dockercfg-5m62w"
Mar 07 21:29:44.017480 master-0 kubenswrapper[16352]: I0307 21:29:44.017317 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 07 21:29:44.225660 master-0 kubenswrapper[16352]: I0307 21:29:44.225488 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 07 21:29:44.457363 master-0 kubenswrapper[16352]: I0307 21:29:44.457271 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt"
Mar 07 21:29:44.900529 master-0 kubenswrapper[16352]: I0307 21:29:44.900405 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"openshift-service-ca.crt"
Mar 07 21:29:44.994000 master-0 kubenswrapper[16352]: I0307 21:29:44.993923 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 07 21:29:45.045634 master-0 kubenswrapper[16352]: I0307 21:29:45.045557 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt"
Mar 07 21:29:45.067213 master-0 kubenswrapper[16352]: I0307 21:29:45.067074 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 07 21:29:45.385128 master-0 kubenswrapper[16352]: I0307 21:29:45.385035 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"openshift-service-ca.crt"
Mar 07 21:29:45.450435 master-0 kubenswrapper[16352]: I0307 21:29:45.450321 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 07 21:29:45.567673 master-0 kubenswrapper[16352]: I0307 21:29:45.567586 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 07 21:29:45.780197 master-0 kubenswrapper[16352]: I0307 21:29:45.779945 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"node-tuning-operator-tls"
Mar 07 21:29:45.819971 master-0 kubenswrapper[16352]: I0307 21:29:45.819882 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt"
Mar 07 21:29:46.009569 master-0 kubenswrapper[16352]: I0307 21:29:46.009497 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 07 21:29:46.138361 master-0 kubenswrapper[16352]: I0307 21:29:46.138200 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-client-certs"
Mar 07 21:29:46.194052 master-0 kubenswrapper[16352]: I0307 21:29:46.193946 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 07 21:29:46.226573 master-0 kubenswrapper[16352]: I0307 21:29:46.226448 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-metrics"
Mar 07 21:29:46.259317 master-0 kubenswrapper[16352]: I0307 21:29:46.259233 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-cdmkh"
Mar 07 21:29:46.388152 master-0 kubenswrapper[16352]: I0307 21:29:46.387983 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-fswfb"
Mar 07 21:29:46.407452 master-0 kubenswrapper[16352]: I0307 21:29:46.407265 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-tls"
Mar 07 21:29:46.622373 master-0 kubenswrapper[16352]: I0307 21:29:46.622206 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-operator-tls"
Mar 07 21:29:46.640254 master-0 kubenswrapper[16352]: I0307 21:29:46.640122 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt"
Mar 07 21:29:46.654671 master-0 kubenswrapper[16352]: I0307 21:29:46.654610 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-olm-operator"/"cluster-olm-operator-serving-cert"
Mar 07 21:29:46.793826 master-0 kubenswrapper[16352]: I0307 21:29:46.793562 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-cert"
Mar 07 21:29:46.803084 master-0 kubenswrapper[16352]: I0307 21:29:46.802964 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-kube-rbac-proxy-config"
Mar 07 21:29:46.811826 master-0 kubenswrapper[16352]: I0307 21:29:46.811769 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 07 21:29:46.827847 master-0 kubenswrapper[16352]: I0307 21:29:46.827755 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 07 21:29:46.868808 master-0 kubenswrapper[16352]: I0307 21:29:46.868666 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-w2xft"
Mar 07 21:29:46.889597 master-0 kubenswrapper[16352]: I0307 21:29:46.889503 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 07 21:29:46.891945 master-0 kubenswrapper[16352]: I0307 21:29:46.891869 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"openshift-insights-serving-cert"
Mar 07 21:29:46.934417 master-0 kubenswrapper[16352]: I0307 21:29:46.934342 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config"
Mar 07 21:29:46.966257 master-0 kubenswrapper[16352]: I0307 21:29:46.966057 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"kube-root-ca.crt"
Mar 07 21:29:47.092727 master-0 kubenswrapper[16352]: I0307 21:29:47.092507 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 07 21:29:47.178823 master-0 kubenswrapper[16352]: I0307 21:29:47.178667 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"default-dockercfg-q695j"
Mar 07 21:29:47.237161 master-0 kubenswrapper[16352]: I0307 21:29:47.237046 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"whereabouts-config"
Mar 07 21:29:47.295283 master-0 kubenswrapper[16352]: I0307 21:29:47.295214 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-df95k"
Mar 07 21:29:47.296667 master-0 kubenswrapper[16352]: I0307 21:29:47.296617 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 07 21:29:47.314650 master-0 kubenswrapper[16352]: I0307 21:29:47.314605 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 07 21:29:47.347593 master-0 kubenswrapper[16352]: I0307 21:29:47.347418 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 07 21:29:47.416160 master-0 kubenswrapper[16352]: I0307 21:29:47.416031 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt"
Mar 07 21:29:47.501477 master-0 kubenswrapper[16352]: I0307 21:29:47.501359 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 07 21:29:47.502495 master-0 kubenswrapper[16352]: I0307 21:29:47.502423 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 07 21:29:47.587038 master-0 kubenswrapper[16352]: I0307 21:29:47.586950 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-tls"
Mar 07 21:29:47.677076 master-0 kubenswrapper[16352]: I0307 21:29:47.676836 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"openshift-service-ca.crt"
Mar 07 21:29:47.774292 master-0 kubenswrapper[16352]: I0307 21:29:47.774187 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 07 21:29:47.779452 master-0 kubenswrapper[16352]: I0307 21:29:47.779394 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 07 21:29:47.893039 master-0 kubenswrapper[16352]: I0307 21:29:47.892919 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 07 21:29:47.941111 master-0 kubenswrapper[16352]: I0307 21:29:47.940896 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-tls"
Mar 07 21:29:47.978382 master-0 kubenswrapper[16352]: I0307 21:29:47.978302 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 07 21:29:47.991318 master-0 kubenswrapper[16352]: I0307 21:29:47.991203 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 07 21:29:48.017506 master-0 kubenswrapper[16352]: I0307 21:29:48.017409 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt"
Mar 07 21:29:48.062867 master-0 kubenswrapper[16352]: I0307 21:29:48.062781 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt"
Mar 07 21:29:48.080733 master-0 kubenswrapper[16352]: I0307 21:29:48.080658 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 07 21:29:48.111060 master-0 kubenswrapper[16352]: I0307 21:29:48.110998 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle"
Mar 07 21:29:48.158500 master-0 kubenswrapper[16352]: I0307 21:29:48.158436 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 07 21:29:48.202115 master-0 kubenswrapper[16352]: I0307 21:29:48.201892 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 07 21:29:48.210013 master-0 kubenswrapper[16352]: I0307 21:29:48.209966 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-mnvfv"
Mar 07 21:29:48.216479 master-0 kubenswrapper[16352]: I0307 21:29:48.216437 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-root-ca.crt"
Mar 07 21:29:48.338086 master-0 kubenswrapper[16352]: I0307 21:29:48.338036 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-olm-operator"/"kube-root-ca.crt"
Mar 07 21:29:48.346066 master-0 kubenswrapper[16352]: I0307 21:29:48.346034 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 07 21:29:48.367245 master-0 kubenswrapper[16352]: I0307 21:29:48.367189 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 07 21:29:48.367722 master-0 kubenswrapper[16352]: I0307 21:29:48.367664 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"baremetal-kube-rbac-proxy"
Mar 07 21:29:48.437290 master-0 kubenswrapper[16352]: I0307 21:29:48.437204 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 07 21:29:48.452290 master-0 kubenswrapper[16352]: I0307 21:29:48.452148 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt"
Mar 07 21:29:48.514816 master-0 kubenswrapper[16352]: I0307 21:29:48.514347 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt"
Mar 07 21:29:48.521428 master-0 kubenswrapper[16352]: I0307 21:29:48.521301 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 07 21:29:48.571278 master-0 kubenswrapper[16352]: I0307 21:29:48.570966 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-catalogd"/"catalogserver-cert"
Mar 07 21:29:48.577049 master-0 kubenswrapper[16352]: I0307 21:29:48.576991 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body=
Mar 07 21:29:48.577216 master-0 kubenswrapper[16352]: I0307 21:29:48.577055 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused"
Mar 07 21:29:48.600082 master-0 kubenswrapper[16352]: I0307 21:29:48.599979 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 07 21:29:48.627034 master-0 kubenswrapper[16352]: I0307 21:29:48.626928 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls"
Mar 07 21:29:48.628484 master-0 kubenswrapper[16352]: I0307 21:29:48.628414 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 07 21:29:48.631509 master-0 kubenswrapper[16352]: I0307 21:29:48.631444 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 07 21:29:48.667932 master-0 kubenswrapper[16352]: I0307 21:29:48.664741 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-kube-rbac-proxy-config"
Mar 07 21:29:48.674848 master-0 kubenswrapper[16352]: I0307 21:29:48.673936 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-kube-rbac-proxy-config"
Mar 07 21:29:48.717054 master-0 kubenswrapper[16352]: I0307 21:29:48.716780 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"kube-root-ca.crt"
Mar 07 21:29:48.806922 master-0 kubenswrapper[16352]: I0307 21:29:48.806830 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-dockercfg-7982x"
Mar 07 21:29:48.890417 master-0 kubenswrapper[16352]: I0307 21:29:48.890307 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 07 21:29:48.932724 master-0 kubenswrapper[16352]: I0307 21:29:48.909710 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 07 21:29:48.953094 master-0 kubenswrapper[16352]: I0307 21:29:48.952993 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 07 21:29:48.994309 master-0 kubenswrapper[16352]: I0307 21:29:48.994115 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1"
Mar 07 21:29:49.017451 master-0 kubenswrapper[16352]: I0307 21:29:49.017355 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 07 21:29:49.136029 master-0 kubenswrapper[16352]: I0307 21:29:49.135922 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 07 21:29:49.229090 master-0 kubenswrapper[16352]: I0307 21:29:49.229001 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 07 21:29:49.236755 master-0 kubenswrapper[16352]: I0307 21:29:49.236429 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"cluster-monitoring-operator-tls"
Mar 07 21:29:49.287498 master-0 kubenswrapper[16352]: I0307 21:29:49.287135 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kube-state-metrics-custom-resource-state-configmap"
Mar 07 21:29:49.326966 master-0 kubenswrapper[16352]: I0307 21:29:49.326880 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 07 21:29:49.372818 master-0 kubenswrapper[16352]: I0307 21:29:49.372741 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert"
Mar 07 21:29:49.376955 master-0 kubenswrapper[16352]: I0307 21:29:49.376879 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-2tlv4"
Mar 07 21:29:49.413726 master-0 kubenswrapper[16352]: I0307 21:29:49.413613 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-tls"
Mar 07 21:29:49.414291 master-0 kubenswrapper[16352]: I0307 21:29:49.414220 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 07 21:29:49.481455 master-0 kubenswrapper[16352]: I0307 21:29:49.481381 16352 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 07 21:29:49.487515 master-0 kubenswrapper[16352]: I0307 21:29:49.487419 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podStartSLOduration=39.487398649 podStartE2EDuration="39.487398649s" podCreationTimestamp="2026-03-07 21:29:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:29:33.415849811 +0000 UTC m=+696.486554870" watchObservedRunningTime="2026-03-07 21:29:49.487398649 +0000 UTC m=+712.558103698"
Mar 07 21:29:49.489607 master-0 kubenswrapper[16352]: I0307 21:29:49.489574 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 07 21:29:49.489607 master-0 kubenswrapper[16352]: I0307 21:29:49.489623 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-master-0"]
Mar 07 21:29:49.495403 master-0 kubenswrapper[16352]: I0307 21:29:49.495328 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-master-0"
Mar 07 21:29:49.525944 master-0 kubenswrapper[16352]: I0307 21:29:49.525821 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-master-0" podStartSLOduration=16.525794325 podStartE2EDuration="16.525794325s" podCreationTimestamp="2026-03-07 21:29:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:29:49.523395328 +0000 UTC m=+712.594100407" watchObservedRunningTime="2026-03-07 21:29:49.525794325 +0000 UTC m=+712.596499394"
Mar 07 21:29:49.558813 master-0 kubenswrapper[16352]: I0307 21:29:49.558566 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-root-ca.crt"
Mar 07 21:29:49.773818 master-0 kubenswrapper[16352]: I0307 21:29:49.773747 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 07 21:29:49.819804 master-0 kubenswrapper[16352]: I0307 21:29:49.819631 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1"
Mar 07 21:29:49.824145 master-0 kubenswrapper[16352]: I0307 21:29:49.824072 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 07 21:29:49.872815 master-0 kubenswrapper[16352]: I0307 21:29:49.872719 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 07 21:29:49.909053 master-0 kubenswrapper[16352]: I0307 21:29:49.908952 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"openshift-service-ca.crt"
Mar 07 21:29:49.924038 master-0 kubenswrapper[16352]: I0307 21:29:49.923954 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-7fv8q"
Mar 07 21:29:49.957292 master-0 kubenswrapper[16352]: I0307 21:29:49.957224 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config"
Mar 07 21:29:50.042617 master-0 kubenswrapper[16352]: I0307 21:29:50.042528 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates"
Mar 07 21:29:50.047763 master-0 kubenswrapper[16352]: I0307 21:29:50.047663 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 07 21:29:50.049814 master-0 kubenswrapper[16352]: I0307 21:29:50.049783 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-kube-rbac-proxy-config"
Mar 07 21:29:50.123467 master-0 kubenswrapper[16352]: I0307 21:29:50.123377 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin"
Mar 07 21:29:50.127623 master-0 kubenswrapper[16352]: I0307 21:29:50.127563 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemetry-config"
Mar 07 21:29:50.156467 master-0 kubenswrapper[16352]: I0307 21:29:50.156305 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-kd5ps"
Mar 07 21:29:50.160356 master-0 kubenswrapper[16352]: I0307 21:29:50.160297 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 07 21:29:50.164086 master-0 kubenswrapper[16352]: I0307 21:29:50.163995 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 07 21:29:50.233968 master-0 kubenswrapper[16352]: I0307 21:29:50.233860 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 07 21:29:50.261924 master-0 kubenswrapper[16352]: I0307 21:29:50.261855 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt"
Mar 07 21:29:50.311213 master-0 kubenswrapper[16352]: I0307 21:29:50.311077 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 07 21:29:50.343403 master-0 kubenswrapper[16352]: I0307 21:29:50.343304 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt"
Mar 07 21:29:50.379909 master-0 kubenswrapper[16352]: I0307 21:29:50.379664 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-6xbgq"
Mar 07 21:29:50.380288 master-0 kubenswrapper[16352]: I0307 21:29:50.379940 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"kube-root-ca.crt"
Mar 07 21:29:50.383211 master-0 kubenswrapper[16352]: I0307 21:29:50.383163 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Mar 07 21:29:50.391503 master-0 kubenswrapper[16352]: I0307 21:29:50.391416 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"kubelet-serving-ca-bundle"
Mar 07 21:29:50.476156 master-0 kubenswrapper[16352]: I0307 21:29:50.476075 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 07 21:29:50.482673 master-0 kubenswrapper[16352]: I0307 21:29:50.482633 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 07 21:29:50.486599 master-0 kubenswrapper[16352]: I0307 21:29:50.486540 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 07 21:29:50.496119 master-0 kubenswrapper[16352]: I0307 21:29:50.496048 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 07 21:29:50.520014 master-0 kubenswrapper[16352]: I0307 21:29:50.519907 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib"
Mar 07 21:29:50.569381 master-0 kubenswrapper[16352]: I0307 21:29:50.569319 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 07 21:29:50.582411 master-0 kubenswrapper[16352]: I0307 21:29:50.582348 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 07 21:29:50.597069 master-0 kubenswrapper[16352]: I0307 21:29:50.596995 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"federate-client-certs"
Mar 07 21:29:50.615094 master-0 kubenswrapper[16352]: I0307 21:29:50.615007 16352 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 07 21:29:50.652233 master-0 kubenswrapper[16352]: I0307 21:29:50.652025 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-client-ca"
Mar 07 21:29:50.764570 master-0 kubenswrapper[16352]: I0307 21:29:50.764472 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 07 21:29:50.777044 master-0 kubenswrapper[16352]: I0307 21:29:50.776995 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"monitoring-plugin-cert"
Mar 07 21:29:50.794810 master-0 kubenswrapper[16352]: I0307 21:29:50.794731 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 07 21:29:50.797100 master-0 kubenswrapper[16352]: I0307 21:29:50.797037 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"default-dockercfg-tpl92"
Mar 07 21:29:50.817731 master-0 kubenswrapper[16352]: I0307 21:29:50.817605 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"openshift-state-metrics-kube-rbac-proxy-config"
Mar 07 21:29:50.825503 master-0 kubenswrapper[16352]: I0307 21:29:50.825432 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"kube-rbac-proxy"
Mar 07 21:29:50.904463 master-0 kubenswrapper[16352]: I0307 21:29:50.904230 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 07 21:29:50.927329 master-0 kubenswrapper[16352]: I0307 21:29:50.927240 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Mar 07 21:29:51.024316 master-0 kubenswrapper[16352]: I0307 21:29:51.024218 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert"
Mar 07 21:29:51.086426 master-0 kubenswrapper[16352]: I0307 21:29:51.086339 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 07 21:29:51.116895 master-0 kubenswrapper[16352]: I0307 21:29:51.116822 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 07 21:29:51.148649 master-0 kubenswrapper[16352]: I0307 21:29:51.148580 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 07 21:29:51.259567 master-0 kubenswrapper[16352]: I0307 21:29:51.259374 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-3j1qmkmjalrq1"
Mar 07 21:29:51.305046 master-0 kubenswrapper[16352]: I0307 21:29:51.304969 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 07 21:29:51.357877 master-0 kubenswrapper[16352]: I0307 21:29:51.357757 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 07 21:29:51.393597 master-0 kubenswrapper[16352]: I0307 21:29:51.393532 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 07 21:29:51.394351 master-0 kubenswrapper[16352]: I0307 21:29:51.394294 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-tls"
Mar 07 21:29:51.401030 master-0 kubenswrapper[16352]: I0307 21:29:51.400985 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 07 21:29:51.435413 master-0 kubenswrapper[16352]: I0307 21:29:51.435320 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-admission-webhook-tls"
Mar 07 21:29:51.534197 master-0 kubenswrapper[16352]: I0307 21:29:51.533846 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"openshift-service-ca.crt"
Mar 07 21:29:51.591786 master-0 kubenswrapper[16352]: I0307 21:29:51.591069 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert"
Mar 07 21:29:51.757865 master-0 kubenswrapper[16352]: I0307 21:29:51.757771 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"node-exporter-dockercfg-jtttv"
Mar 07 21:29:51.772438 master-0 kubenswrapper[16352]: I0307 21:29:51.772339 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-state-metrics-dockercfg-xvgqm"
Mar 07 21:29:51.827114 master-0 kubenswrapper[16352]: I0307 21:29:51.826907 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-baremetal-webhook-server-cert"
Mar 07 21:29:51.903505 master-0 kubenswrapper[16352]: I0307 21:29:51.903393 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"trusted-ca-bundle"
Mar 07 21:29:51.909335 master-0 kubenswrapper[16352]: I0307 21:29:51.909277 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt"
Mar 07 21:29:51.975589 master-0 kubenswrapper[16352]: I0307 21:29:51.975463 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-dockercfg-gvhc4"
Mar 07 21:29:52.056662 master-0 kubenswrapper[16352]: I0307 21:29:52.056528 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-web"
Mar 07 21:29:52.057786 master-0 kubenswrapper[16352]: I0307 21:29:52.057607 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 07 21:29:52.122774 master-0 kubenswrapper[16352]: I0307 21:29:52.122610 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 07 21:29:52.185525 master-0 kubenswrapper[16352]: I0307 21:29:52.185410 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy-cluster-autoscaler-operator"
Mar 07 21:29:52.224477 master-0 kubenswrapper[16352]: I0307 21:29:52.224362 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cluster-cloud-controller-manager-dockercfg-lvvbn"
Mar 07 21:29:52.314959 master-0 kubenswrapper[16352]: I0307 21:29:52.314845 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 07 21:29:52.320030 master-0 kubenswrapper[16352]: I0307 21:29:52.319945 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 07 21:29:52.341769 master-0 kubenswrapper[16352]: I0307 21:29:52.341634 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 07 21:29:52.380634 master-0 kubenswrapper[16352]: I0307 21:29:52.380471 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 07 21:29:52.415344 master-0 kubenswrapper[16352]: I0307 21:29:52.415265 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"openshift-service-ca.crt"
Mar 07 21:29:52.464768 master-0 kubenswrapper[16352]: I0307 21:29:52.464666 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"metrics-server-dockercfg-wbb4k"
Mar 07 21:29:52.567897 master-0 kubenswrapper[16352]: I0307 21:29:52.567787 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy"
Mar 07 21:29:52.591067 master-0 kubenswrapper[16352]: I0307 21:29:52.590989 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 07 21:29:52.628826 master-0 kubenswrapper[16352]: I0307 21:29:52.628747 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-psq7z"
Mar 07 21:29:52.658198 master-0 kubenswrapper[16352]: I0307 21:29:52.658016 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-insights"/"operator-dockercfg-lbvsg"
Mar 07 21:29:52.807299 master-0 kubenswrapper[16352]: I0307 21:29:52.807230 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-2z9v6"
Mar 07 21:29:53.118499 master-0 kubenswrapper[16352]: I0307 21:29:53.118430 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources"
Mar 07 21:29:53.138854 master-0 kubenswrapper[16352]: I0307 21:29:53.138780 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client"
Mar 07 21:29:53.147735 master-0 kubenswrapper[16352]: I0307 21:29:53.147635 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert"
Mar 07 21:29:53.174215 master-0 kubenswrapper[16352]: I0307 21:29:53.174159 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 07 21:29:53.174846 master-0 kubenswrapper[16352]: I0307 21:29:53.174799 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 07 21:29:53.231597 master-0 kubenswrapper[16352]: I0307 21:29:53.231345 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h286h"
Mar 07 21:29:53.237014 master-0 kubenswrapper[16352]: I0307 21:29:53.236141 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-grpc-tls-k9ktl35kg68d"
Mar 07 21:29:53.298528 master-0 kubenswrapper[16352]: I0307 21:29:53.298439 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-node-tuning-operator"/"performance-addon-operator-webhook-cert"
Mar 07 21:29:53.323819 master-0 kubenswrapper[16352]: I0307 21:29:53.323634 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt"
Mar 07 21:29:53.352148 master-0 kubenswrapper[16352]: I0307 21:29:53.352084 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 07 21:29:53.353802 master-0 kubenswrapper[16352]: I0307 21:29:53.353750 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 07 21:29:53.433206 master-0 kubenswrapper[16352]: I0307 21:29:53.433083 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 07 21:29:53.460316 master-0 kubenswrapper[16352]: I0307 21:29:53.460254 16352 reflector.go:368] Caches populated for *v1.ConfigMap from
object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 07 21:29:53.481013 master-0 kubenswrapper[16352]: I0307 21:29:53.480943 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 07 21:29:53.494903 master-0 kubenswrapper[16352]: I0307 21:29:53.494807 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-lj4zb" Mar 07 21:29:53.585427 master-0 kubenswrapper[16352]: I0307 21:29:53.585334 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 07 21:29:53.619176 master-0 kubenswrapper[16352]: I0307 21:29:53.619094 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 07 21:29:53.633407 master-0 kubenswrapper[16352]: I0307 21:29:53.633309 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 07 21:29:53.681655 master-0 kubenswrapper[16352]: I0307 21:29:53.681573 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"service-ca-bundle" Mar 07 21:29:53.695166 master-0 kubenswrapper[16352]: I0307 21:29:53.695018 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 07 21:29:53.735596 master-0 kubenswrapper[16352]: I0307 21:29:53.735481 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 07 21:29:53.841657 master-0 kubenswrapper[16352]: I0307 21:29:53.841540 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 07 21:29:53.940829 master-0 kubenswrapper[16352]: I0307 21:29:53.940746 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 07 
21:29:53.970774 master-0 kubenswrapper[16352]: I0307 21:29:53.970569 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 07 21:29:54.006348 master-0 kubenswrapper[16352]: I0307 21:29:54.006247 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-trusted-ca-bundle-8i12ta5c71j38" Mar 07 21:29:54.075750 master-0 kubenswrapper[16352]: I0307 21:29:54.071594 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 07 21:29:54.186748 master-0 kubenswrapper[16352]: I0307 21:29:54.184893 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 07 21:29:54.296714 master-0 kubenswrapper[16352]: I0307 21:29:54.296468 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 07 21:29:54.391001 master-0 kubenswrapper[16352]: I0307 21:29:54.390933 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 07 21:29:54.529813 master-0 kubenswrapper[16352]: I0307 21:29:54.529631 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 07 21:29:54.664635 master-0 kubenswrapper[16352]: I0307 21:29:54.633572 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 07 21:29:54.912564 master-0 kubenswrapper[16352]: I0307 21:29:54.912478 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 07 21:29:55.129499 master-0 kubenswrapper[16352]: I0307 21:29:55.129383 16352 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-monitoring"/"thanos-querier-tls" Mar 07 21:29:55.276602 master-0 kubenswrapper[16352]: I0307 21:29:55.276518 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-images" Mar 07 21:29:55.288161 master-0 kubenswrapper[16352]: I0307 21:29:55.288090 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 07 21:29:55.296431 master-0 kubenswrapper[16352]: I0307 21:29:55.296374 16352 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 07 21:29:55.297512 master-0 kubenswrapper[16352]: I0307 21:29:55.297452 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 07 21:29:55.326895 master-0 kubenswrapper[16352]: I0307 21:29:55.326799 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-l888p" Mar 07 21:29:55.387087 master-0 kubenswrapper[16352]: I0307 21:29:55.386820 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"trusted-ca" Mar 07 21:29:55.458772 master-0 kubenswrapper[16352]: I0307 21:29:55.458666 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-z5sb9" Mar 07 21:29:55.500178 master-0 kubenswrapper[16352]: I0307 21:29:55.500094 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 07 21:29:55.618044 master-0 kubenswrapper[16352]: I0307 21:29:55.617953 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-serving-cert" Mar 07 21:29:55.659517 master-0 kubenswrapper[16352]: I0307 
21:29:55.659311 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 07 21:29:55.705736 master-0 kubenswrapper[16352]: I0307 21:29:55.705621 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 07 21:29:55.715952 master-0 kubenswrapper[16352]: I0307 21:29:55.715881 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 07 21:29:55.754946 master-0 kubenswrapper[16352]: I0307 21:29:55.754856 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 07 21:29:55.761505 master-0 kubenswrapper[16352]: I0307 21:29:55.761420 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 07 21:29:55.772617 master-0 kubenswrapper[16352]: I0307 21:29:55.772540 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 07 21:29:55.778048 master-0 kubenswrapper[16352]: I0307 21:29:55.777999 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 07 21:29:55.809433 master-0 kubenswrapper[16352]: I0307 21:29:55.809346 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-v8df8" Mar 07 21:29:56.032170 master-0 kubenswrapper[16352]: I0307 21:29:56.031985 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 07 21:29:56.050221 master-0 kubenswrapper[16352]: I0307 21:29:56.050143 16352 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 07 21:29:56.050588 master-0 
kubenswrapper[16352]: I0307 21:29:56.050459 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor" containerID="cri-o://3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf" gracePeriod=5 Mar 07 21:29:56.054046 master-0 kubenswrapper[16352]: I0307 21:29:56.053977 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 07 21:29:56.085330 master-0 kubenswrapper[16352]: I0307 21:29:56.085247 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-mkc28" Mar 07 21:29:56.141938 master-0 kubenswrapper[16352]: I0307 21:29:56.141826 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 07 21:29:56.184087 master-0 kubenswrapper[16352]: I0307 21:29:56.183807 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 07 21:29:56.209025 master-0 kubenswrapper[16352]: I0307 21:29:56.208709 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 07 21:29:56.302557 master-0 kubenswrapper[16352]: I0307 21:29:56.302392 16352 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 07 21:29:56.322372 master-0 kubenswrapper[16352]: I0307 21:29:56.322276 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"telemeter-client-serving-certs-ca-bundle" Mar 07 21:29:56.325884 master-0 kubenswrapper[16352]: I0307 21:29:56.325829 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-controller"/"operator-controller-trusted-ca-bundle" Mar 07 21:29:56.364844 master-0 
kubenswrapper[16352]: I0307 21:29:56.364757 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"cluster-autoscaler-operator-dockercfg-wq5zr" Mar 07 21:29:56.366178 master-0 kubenswrapper[16352]: I0307 21:29:56.366106 16352 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 07 21:29:56.371610 master-0 kubenswrapper[16352]: I0307 21:29:56.371557 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 07 21:29:56.379729 master-0 kubenswrapper[16352]: I0307 21:29:56.379628 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 07 21:29:56.406704 master-0 kubenswrapper[16352]: I0307 21:29:56.406590 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 07 21:29:56.495999 master-0 kubenswrapper[16352]: I0307 21:29:56.495910 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 07 21:29:56.520597 master-0 kubenswrapper[16352]: I0307 21:29:56.520508 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 07 21:29:56.531186 master-0 kubenswrapper[16352]: I0307 21:29:56.531126 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 07 21:29:56.560479 master-0 kubenswrapper[16352]: I0307 21:29:56.560330 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"cluster-baremetal-operator-images" Mar 07 21:29:56.564303 master-0 kubenswrapper[16352]: I0307 21:29:56.564233 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-catalogd"/"catalogd-trusted-ca-bundle" Mar 07 21:29:56.653878 
master-0 kubenswrapper[16352]: I0307 21:29:56.653788 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"cco-trusted-ca" Mar 07 21:29:56.668752 master-0 kubenswrapper[16352]: I0307 21:29:56.668627 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 07 21:29:56.721403 master-0 kubenswrapper[16352]: I0307 21:29:56.721303 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 07 21:29:56.783647 master-0 kubenswrapper[16352]: I0307 21:29:56.781360 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-x6w69" Mar 07 21:29:56.997069 master-0 kubenswrapper[16352]: I0307 21:29:56.996985 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-dockercfg-t2xgn" Mar 07 21:29:57.062171 master-0 kubenswrapper[16352]: I0307 21:29:57.062089 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 07 21:29:57.083200 master-0 kubenswrapper[16352]: I0307 21:29:57.083091 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 07 21:29:57.140052 master-0 kubenswrapper[16352]: I0307 21:29:57.139956 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cloud-credential-operator"/"openshift-service-ca.crt" Mar 07 21:29:57.159956 master-0 kubenswrapper[16352]: I0307 21:29:57.159841 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 07 21:29:57.171145 master-0 kubenswrapper[16352]: I0307 21:29:57.171090 16352 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 07 21:29:57.240446 master-0 kubenswrapper[16352]: I0307 21:29:57.240244 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 07 21:29:57.391406 master-0 kubenswrapper[16352]: I0307 21:29:57.391327 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-hqsqr" Mar 07 21:29:57.401644 master-0 kubenswrapper[16352]: I0307 21:29:57.401578 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 07 21:29:57.659661 master-0 kubenswrapper[16352]: I0307 21:29:57.659527 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 07 21:29:57.763722 master-0 kubenswrapper[16352]: I0307 21:29:57.763624 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-node-tuning-operator"/"kube-root-ca.crt" Mar 07 21:29:57.779148 master-0 kubenswrapper[16352]: I0307 21:29:57.779078 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 07 21:29:57.787488 master-0 kubenswrapper[16352]: I0307 21:29:57.787392 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy-rules" Mar 07 21:29:57.845146 master-0 kubenswrapper[16352]: I0307 21:29:57.845060 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-storage-operator"/"cluster-storage-operator-dockercfg-b6gqw" Mar 07 21:29:57.904328 master-0 kubenswrapper[16352]: I0307 21:29:57.904192 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 07 21:29:57.990314 master-0 kubenswrapper[16352]: I0307 
21:29:57.990147 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 07 21:29:58.017919 master-0 kubenswrapper[16352]: I0307 21:29:58.017837 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-storage-operator"/"kube-root-ca.crt" Mar 07 21:29:58.069284 master-0 kubenswrapper[16352]: I0307 21:29:58.069182 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 07 21:29:58.258890 master-0 kubenswrapper[16352]: I0307 21:29:58.258727 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 07 21:29:58.306798 master-0 kubenswrapper[16352]: I0307 21:29:58.306710 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 07 21:29:58.503884 master-0 kubenswrapper[16352]: I0307 21:29:58.503793 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 07 21:29:58.524977 master-0 kubenswrapper[16352]: I0307 21:29:58.524784 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 07 21:29:58.577053 master-0 kubenswrapper[16352]: I0307 21:29:58.576954 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:29:58.577053 master-0 kubenswrapper[16352]: I0307 21:29:58.577033 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 
10.128.0.91:8443: connect: connection refused" Mar 07 21:29:58.675757 master-0 kubenswrapper[16352]: I0307 21:29:58.675657 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 07 21:29:58.767230 master-0 kubenswrapper[16352]: I0307 21:29:58.767097 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cloud-controller-manager-operator"/"cloud-controller-manager-operator-tls" Mar 07 21:29:58.863671 master-0 kubenswrapper[16352]: I0307 21:29:58.863526 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 07 21:29:58.926290 master-0 kubenswrapper[16352]: I0307 21:29:58.926167 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"kube-root-ca.crt" Mar 07 21:29:58.954873 master-0 kubenswrapper[16352]: I0307 21:29:58.954223 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-insights"/"openshift-service-ca.crt" Mar 07 21:29:59.155932 master-0 kubenswrapper[16352]: I0307 21:29:59.155714 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 07 21:29:59.176220 master-0 kubenswrapper[16352]: I0307 21:29:59.176114 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 07 21:29:59.208180 master-0 kubenswrapper[16352]: I0307 21:29:59.208056 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 07 21:29:59.225984 master-0 kubenswrapper[16352]: I0307 21:29:59.225867 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"thanos-querier-kube-rbac-proxy" Mar 07 21:29:59.544163 master-0 kubenswrapper[16352]: I0307 21:29:59.543986 16352 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cloud-credential-operator"/"cloud-credential-operator-serving-cert" Mar 07 21:29:59.658169 master-0 kubenswrapper[16352]: I0307 21:29:59.658083 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 07 21:30:01.667966 master-0 kubenswrapper[16352]: I0307 21:30:01.667836 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_3a18cac8a90d6913a6a0391d805cddc9/startup-monitor/0.log" Mar 07 21:30:01.668985 master-0 kubenswrapper[16352]: I0307 21:30:01.668003 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:30:01.787940 master-0 kubenswrapper[16352]: I0307 21:30:01.787824 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " Mar 07 21:30:01.787940 master-0 kubenswrapper[16352]: I0307 21:30:01.787914 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " Mar 07 21:30:01.787940 master-0 kubenswrapper[16352]: I0307 21:30:01.787958 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " Mar 07 21:30:01.788415 master-0 kubenswrapper[16352]: I0307 21:30:01.788025 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " Mar 07 21:30:01.788415 master-0 kubenswrapper[16352]: I0307 21:30:01.788014 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests" (OuterVolumeSpecName: "manifests") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:30:01.788415 master-0 kubenswrapper[16352]: I0307 21:30:01.788105 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") pod \"3a18cac8a90d6913a6a0391d805cddc9\" (UID: \"3a18cac8a90d6913a6a0391d805cddc9\") " Mar 07 21:30:01.788415 master-0 kubenswrapper[16352]: I0307 21:30:01.788125 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock" (OuterVolumeSpecName: "var-lock") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:30:01.788784 master-0 kubenswrapper[16352]: I0307 21:30:01.788666 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log" (OuterVolumeSpecName: "var-log") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "var-log". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:30:01.788957 master-0 kubenswrapper[16352]: I0307 21:30:01.788913 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:30:01.789638 master-0 kubenswrapper[16352]: I0307 21:30:01.789568 16352 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-manifests\") on node \"master-0\" DevicePath \"\"" Mar 07 21:30:01.789638 master-0 kubenswrapper[16352]: I0307 21:30:01.789620 16352 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-lock\") on node \"master-0\" DevicePath \"\"" Mar 07 21:30:01.789887 master-0 kubenswrapper[16352]: I0307 21:30:01.789643 16352 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-var-log\") on node \"master-0\" DevicePath \"\"" Mar 07 21:30:01.789887 master-0 kubenswrapper[16352]: I0307 21:30:01.789663 16352 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:30:01.797661 master-0 kubenswrapper[16352]: I0307 21:30:01.797505 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "3a18cac8a90d6913a6a0391d805cddc9" (UID: "3a18cac8a90d6913a6a0391d805cddc9"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:30:01.892036 master-0 kubenswrapper[16352]: I0307 21:30:01.891926 16352 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/3a18cac8a90d6913a6a0391d805cddc9-pod-resource-dir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:30:01.976733 master-0 kubenswrapper[16352]: I0307 21:30:01.976236 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-master-0_3a18cac8a90d6913a6a0391d805cddc9/startup-monitor/0.log" Mar 07 21:30:01.976733 master-0 kubenswrapper[16352]: I0307 21:30:01.976328 16352 generic.go:334] "Generic (PLEG): container finished" podID="3a18cac8a90d6913a6a0391d805cddc9" containerID="3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf" exitCode=137 Mar 07 21:30:01.976733 master-0 kubenswrapper[16352]: I0307 21:30:01.976479 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" Mar 07 21:30:01.976733 master-0 kubenswrapper[16352]: I0307 21:30:01.976540 16352 scope.go:117] "RemoveContainer" containerID="3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf" Mar 07 21:30:02.003247 master-0 kubenswrapper[16352]: I0307 21:30:02.003180 16352 scope.go:117] "RemoveContainer" containerID="3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf" Mar 07 21:30:02.004110 master-0 kubenswrapper[16352]: E0307 21:30:02.004036 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf\": container with ID starting with 3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf not found: ID does not exist" containerID="3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf" Mar 07 21:30:02.004237 master-0 kubenswrapper[16352]: I0307 21:30:02.004167 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf"} err="failed to get container status \"3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf\": rpc error: code = NotFound desc = could not find container \"3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf\": container with ID starting with 3509800a9411baf5b08b083f70b501166720a2f5f445c73e97e2591933dc0ccf not found: ID does not exist" Mar 07 21:30:03.203166 master-0 kubenswrapper[16352]: I0307 21:30:03.203042 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a18cac8a90d6913a6a0391d805cddc9" path="/var/lib/kubelet/pods/3a18cac8a90d6913a6a0391d805cddc9/volumes" Mar 07 21:30:03.204131 master-0 kubenswrapper[16352]: I0307 21:30:03.203560 16352 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" podUID="" Mar 07 21:30:03.232276 master-0 kubenswrapper[16352]: I0307 21:30:03.231944 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 07 21:30:03.232276 master-0 kubenswrapper[16352]: I0307 21:30:03.232089 16352 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="23480484-46c4-48ab-8dcd-415636648162" Mar 07 21:30:03.281102 master-0 kubenswrapper[16352]: I0307 21:30:03.241637 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0"] Mar 07 21:30:03.281102 master-0 kubenswrapper[16352]: I0307 21:30:03.241732 16352 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-master-0" mirrorPodUID="23480484-46c4-48ab-8dcd-415636648162" Mar 07 21:30:08.577302 master-0 kubenswrapper[16352]: I0307 21:30:08.577154 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:30:08.578487 master-0 kubenswrapper[16352]: I0307 21:30:08.577313 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:30:13.615485 master-0 kubenswrapper[16352]: I0307 21:30:13.615351 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"metrics-server-audit-profiles" Mar 07 21:30:15.665290 master-0 
kubenswrapper[16352]: I0307 21:30:15.665240 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client-dockercfg-hmpjk" Mar 07 21:30:16.012768 master-0 kubenswrapper[16352]: I0307 21:30:16.012469 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 07 21:30:16.789861 master-0 kubenswrapper[16352]: I0307 21:30:16.789769 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 07 21:30:18.576907 master-0 kubenswrapper[16352]: I0307 21:30:18.576809 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:30:18.577565 master-0 kubenswrapper[16352]: I0307 21:30:18.576932 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:30:19.915199 master-0 kubenswrapper[16352]: I0307 21:30:19.915092 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-operator-tls" Mar 07 21:30:24.465762 master-0 kubenswrapper[16352]: I0307 21:30:24.465637 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 07 21:30:28.577595 master-0 kubenswrapper[16352]: I0307 21:30:28.577476 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: 
connection refused" start-of-body= Mar 07 21:30:28.577595 master-0 kubenswrapper[16352]: I0307 21:30:28.577580 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:30:28.998274 master-0 kubenswrapper[16352]: I0307 21:30:28.998206 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 07 21:30:31.752396 master-0 kubenswrapper[16352]: I0307 21:30:31.752315 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 07 21:30:33.560544 master-0 kubenswrapper[16352]: I0307 21:30:33.560446 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 07 21:30:33.802020 master-0 kubenswrapper[16352]: I0307 21:30:33.801924 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 07 21:30:38.577935 master-0 kubenswrapper[16352]: I0307 21:30:38.577814 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:30:38.577935 master-0 kubenswrapper[16352]: I0307 21:30:38.577939 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:30:39.071909 master-0 kubenswrapper[16352]: I0307 21:30:39.071800 16352 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"telemeter-client" Mar 07 21:30:48.576414 master-0 kubenswrapper[16352]: I0307 21:30:48.576308 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:30:48.576414 master-0 kubenswrapper[16352]: I0307 21:30:48.576391 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:30:58.577218 master-0 kubenswrapper[16352]: I0307 21:30:58.577110 16352 patch_prober.go:28] interesting pod/console-64d844fb5f-9b28j container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" start-of-body= Mar 07 21:30:58.578605 master-0 kubenswrapper[16352]: I0307 21:30:58.577215 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" probeResult="failure" output="Get \"https://10.128.0.91:8443/health\": dial tcp 10.128.0.91:8443: connect: connection refused" Mar 07 21:31:01.647450 master-0 kubenswrapper[16352]: I0307 21:31:01.647311 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:31:01.648939 master-0 kubenswrapper[16352]: E0307 21:31:01.648274 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor" Mar 07 21:31:01.648939 master-0 kubenswrapper[16352]: I0307 
21:31:01.648310 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor" Mar 07 21:31:01.648939 master-0 kubenswrapper[16352]: E0307 21:31:01.648383 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" containerName="installer" Mar 07 21:31:01.648939 master-0 kubenswrapper[16352]: I0307 21:31:01.648403 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" containerName="installer" Mar 07 21:31:01.648939 master-0 kubenswrapper[16352]: I0307 21:31:01.648726 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a18cac8a90d6913a6a0391d805cddc9" containerName="startup-monitor" Mar 07 21:31:01.648939 master-0 kubenswrapper[16352]: I0307 21:31:01.648802 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="3c0f8565-4b6b-42b2-835c-035812f033f6" containerName="installer" Mar 07 21:31:01.653741 master-0 kubenswrapper[16352]: I0307 21:31:01.653618 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.657993 master-0 kubenswrapper[16352]: I0307 21:31:01.657960 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-578bc8c86c-mczhd"] Mar 07 21:31:01.659429 master-0 kubenswrapper[16352]: I0307 21:31:01.659399 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.660671 master-0 kubenswrapper[16352]: I0307 21:31:01.660590 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-sidecar-tls" Mar 07 21:31:01.661043 master-0 kubenswrapper[16352]: I0307 21:31:01.660986 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls-assets-0" Mar 07 21:31:01.661869 master-0 kubenswrapper[16352]: I0307 21:31:01.661097 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"serving-certs-ca-bundle" Mar 07 21:31:01.661869 master-0 kubenswrapper[16352]: I0307 21:31:01.661180 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s" Mar 07 21:31:01.661869 master-0 kubenswrapper[16352]: I0307 21:31:01.661397 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-tls" Mar 07 21:31:01.661869 master-0 kubenswrapper[16352]: I0307 21:31:01.661448 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-dockercfg-l4g8x" Mar 07 21:31:01.661869 master-0 kubenswrapper[16352]: I0307 21:31:01.661536 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-thanos-prometheus-http-client-file" Mar 07 21:31:01.661869 master-0 kubenswrapper[16352]: I0307 21:31:01.661576 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"kube-rbac-proxy" Mar 07 21:31:01.662465 master-0 kubenswrapper[16352]: I0307 21:31:01.662143 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-kube-rbac-proxy-web" Mar 07 21:31:01.662465 master-0 kubenswrapper[16352]: I0307 21:31:01.662318 16352 reflector.go:368] Caches populated for *v1.ConfigMap 
from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 07 21:31:01.662465 master-0 kubenswrapper[16352]: I0307 21:31:01.662363 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 07 21:31:01.662465 master-0 kubenswrapper[16352]: I0307 21:31:01.662442 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-web-config" Mar 07 21:31:01.663952 master-0 kubenswrapper[16352]: I0307 21:31:01.663902 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-cb4c85d9-8ltxz"] Mar 07 21:31:01.664368 master-0 kubenswrapper[16352]: I0307 21:31:01.664309 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"prometheus-k8s-grpc-tls-2ibssd7q5gl39" Mar 07 21:31:01.665563 master-0 kubenswrapper[16352]: I0307 21:31:01.665521 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 07 21:31:01.665716 master-0 kubenswrapper[16352]: I0307 21:31:01.665551 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" Mar 07 21:31:01.665856 master-0 kubenswrapper[16352]: I0307 21:31:01.665668 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 07 21:31:01.665916 master-0 kubenswrapper[16352]: I0307 21:31:01.665613 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 07 21:31:01.665961 master-0 kubenswrapper[16352]: I0307 21:31:01.665662 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 07 21:31:01.666025 master-0 kubenswrapper[16352]: I0307 21:31:01.665771 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 07 21:31:01.667522 master-0 kubenswrapper[16352]: I0307 21:31:01.667468 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-x8pfn" Mar 07 21:31:01.667909 master-0 kubenswrapper[16352]: I0307 21:31:01.667881 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 07 21:31:01.668159 master-0 kubenswrapper[16352]: I0307 21:31:01.667884 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-config-out\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668244 master-0 kubenswrapper[16352]: I0307 21:31:01.668168 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: 
\"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668244 master-0 kubenswrapper[16352]: I0307 21:31:01.668203 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668244 master-0 kubenswrapper[16352]: I0307 21:31:01.668231 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aff8ea86-e11f-428e-94ee-fef45f3bd856-audit-dir\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668387 master-0 kubenswrapper[16352]: I0307 21:31:01.668255 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668387 master-0 kubenswrapper[16352]: I0307 21:31:01.668285 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") 
" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668387 master-0 kubenswrapper[16352]: I0307 21:31:01.668312 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668387 master-0 kubenswrapper[16352]: I0307 21:31:01.668353 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668387 master-0 kubenswrapper[16352]: I0307 21:31:01.668386 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668411 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668442 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: 
\"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668464 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-router-certs\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668488 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668513 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668537 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66bqq\" (UniqueName: \"kubernetes.io/projected/aff8ea86-e11f-428e-94ee-fef45f3bd856-kube-api-access-66bqq\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " 
pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668558 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-config\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668591 master-0 kubenswrapper[16352]: I0307 21:31:01.668582 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668604 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-web-config\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668632 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668675 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-cliconfig\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668723 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snghg\" (UniqueName: \"kubernetes.io/projected/13ae001a-27f0-4b15-a204-5eaffc4fd835-kube-api-access-snghg\") pod \"multus-admission-controller-cb4c85d9-8ltxz\" (UID: \"13ae001a-27f0-4b15-a204-5eaffc4fd835\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668753 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668784 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-error\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668809 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-login\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668852 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-session\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668878 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-audit-policies\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668902 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-service-ca\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.668942 master-0 kubenswrapper[16352]: I0307 21:31:01.668934 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: 
\"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.669378 master-0 kubenswrapper[16352]: I0307 21:31:01.668968 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/13ae001a-27f0-4b15-a204-5eaffc4fd835-webhook-certs\") pod \"multus-admission-controller-cb4c85d9-8ltxz\" (UID: \"13ae001a-27f0-4b15-a204-5eaffc4fd835\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" Mar 07 21:31:01.669378 master-0 kubenswrapper[16352]: I0307 21:31:01.668991 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.669378 master-0 kubenswrapper[16352]: I0307 21:31:01.669020 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t9cjt\" (UniqueName: \"kubernetes.io/projected/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-kube-api-access-t9cjt\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:01.669378 master-0 kubenswrapper[16352]: I0307 21:31:01.669045 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-serving-cert\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:01.669378 master-0 kubenswrapper[16352]: I0307 21:31:01.669068 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.669378 master-0 kubenswrapper[16352]: I0307 21:31:01.669337 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/cni-sysctl-allowlist-ds-rhtr2"]
Mar 07 21:31:01.670337 master-0 kubenswrapper[16352]: I0307 21:31:01.670292 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2"
Mar 07 21:31:01.671419 master-0 kubenswrapper[16352]: I0307 21:31:01.671397 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs"
Mar 07 21:31:01.671620 master-0 kubenswrapper[16352]: I0307 21:31:01.671453 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 07 21:31:01.671775 master-0 kubenswrapper[16352]: I0307 21:31:01.671743 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-k8s-rulefiles-0"
Mar 07 21:31:01.673537 master-0 kubenswrapper[16352]: I0307 21:31:01.673240 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 07 21:31:01.677860 master-0 kubenswrapper[16352]: I0307 21:31:01.677822 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 07 21:31:01.678045 master-0 kubenswrapper[16352]: I0307 21:31:01.677994 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"prometheus-trusted-ca-bundle"
Mar 07 21:31:01.681097 master-0 kubenswrapper[16352]: I0307 21:31:01.681033 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.681287 master-0 kubenswrapper[16352]: I0307 21:31:01.681232 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 07 21:31:01.686437 master-0 kubenswrapper[16352]: I0307 21:31:01.686383 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-cb4c85d9-8ltxz"]
Mar 07 21:31:01.690168 master-0 kubenswrapper[16352]: I0307 21:31:01.690129 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-sysctl-allowlist"
Mar 07 21:31:01.690316 master-0 kubenswrapper[16352]: I0307 21:31:01.690250 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Mar 07 21:31:01.690463 master-0 kubenswrapper[16352]: I0307 21:31:01.690438 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-metric"
Mar 07 21:31:01.690520 master-0 kubenswrapper[16352]: I0307 21:31:01.690512 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy-web"
Mar 07 21:31:01.691200 master-0 kubenswrapper[16352]: I0307 21:31:01.691173 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls-assets-0"
Mar 07 21:31:01.691500 master-0 kubenswrapper[16352]: I0307 21:31:01.691476 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-kube-rbac-proxy"
Mar 07 21:31:01.691609 master-0 kubenswrapper[16352]: I0307 21:31:01.691584 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-web-config"
Mar 07 21:31:01.691784 master-0 kubenswrapper[16352]: I0307 21:31:01.691759 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-dockercfg-7bx66"
Mar 07 21:31:01.691856 master-0 kubenswrapper[16352]: I0307 21:31:01.691796 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-tls"
Mar 07 21:31:01.691906 master-0 kubenswrapper[16352]: I0307 21:31:01.691862 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-monitoring"/"alertmanager-main-generated"
Mar 07 21:31:01.699383 master-0 kubenswrapper[16352]: I0307 21:31:01.699325 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 07 21:31:01.703926 master-0 kubenswrapper[16352]: I0307 21:31:01.703852 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-monitoring"/"alertmanager-trusted-ca-bundle"
Mar 07 21:31:01.707701 master-0 kubenswrapper[16352]: I0307 21:31:01.707583 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-578bc8c86c-mczhd"]
Mar 07 21:31:01.717838 master-0 kubenswrapper[16352]: I0307 21:31:01.716986 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"]
Mar 07 21:31:01.770790 master-0 kubenswrapper[16352]: I0307 21:31:01.770727 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/13ae001a-27f0-4b15-a204-5eaffc4fd835-webhook-certs\") pod \"multus-admission-controller-cb4c85d9-8ltxz\" (UID: \"13ae001a-27f0-4b15-a204-5eaffc4fd835\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz"
Mar 07 21:31:01.770876 master-0 kubenswrapper[16352]: I0307 21:31:01.770791 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.770876 master-0 kubenswrapper[16352]: I0307 21:31:01.770837 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-config-volume\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.770876 master-0 kubenswrapper[16352]: I0307 21:31:01.770865 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2"
Mar 07 21:31:01.771003 master-0 kubenswrapper[16352]: I0307 21:31:01.770896 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t9cjt\" (UniqueName: \"kubernetes.io/projected/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-kube-api-access-t9cjt\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.771003 master-0 kubenswrapper[16352]: I0307 21:31:01.770928 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-serving-cert\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.771003 master-0 kubenswrapper[16352]: I0307 21:31:01.770955 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.771141 master-0 kubenswrapper[16352]: I0307 21:31:01.771019 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-config-out\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.771141 master-0 kubenswrapper[16352]: I0307 21:31:01.771054 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.771141 master-0 kubenswrapper[16352]: I0307 21:31:01.771085 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.772947 master-0 kubenswrapper[16352]: I0307 21:31:01.772914 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.774716 master-0 kubenswrapper[16352]: I0307 21:31:01.774651 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7cc4efa4-0b49-4490-9334-46c5b516399e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.774837 master-0 kubenswrapper[16352]: I0307 21:31:01.774809 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.774877 master-0 kubenswrapper[16352]: I0307 21:31:01.774853 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aff8ea86-e11f-428e-94ee-fef45f3bd856-audit-dir\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.774914 master-0 kubenswrapper[16352]: I0307 21:31:01.774891 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.775024 master-0 kubenswrapper[16352]: I0307 21:31:01.774957 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775069 master-0 kubenswrapper[16352]: I0307 21:31:01.775052 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775195 master-0 kubenswrapper[16352]: I0307 21:31:01.775165 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc4efa4-0b49-4490-9334-46c5b516399e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.775300 master-0 kubenswrapper[16352]: I0307 21:31:01.775264 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775386 master-0 kubenswrapper[16352]: I0307 21:31:01.775365 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775421 master-0 kubenswrapper[16352]: I0307 21:31:01.775402 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.775452 master-0 kubenswrapper[16352]: I0307 21:31:01.775434 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775504 master-0 kubenswrapper[16352]: I0307 21:31:01.775488 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775537 master-0 kubenswrapper[16352]: I0307 21:31:01.775524 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-router-certs\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.775572 master-0 kubenswrapper[16352]: I0307 21:31:01.775561 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775607 master-0 kubenswrapper[16352]: I0307 21:31:01.775594 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.775640 master-0 kubenswrapper[16352]: I0307 21:31:01.775627 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775754 master-0 kubenswrapper[16352]: I0307 21:31:01.775664 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66bqq\" (UniqueName: \"kubernetes.io/projected/aff8ea86-e11f-428e-94ee-fef45f3bd856-kube-api-access-66bqq\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.775754 master-0 kubenswrapper[16352]: I0307 21:31:01.775709 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-config\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775754 master-0 kubenswrapper[16352]: I0307 21:31:01.775742 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775858 master-0 kubenswrapper[16352]: I0307 21:31:01.775778 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-web-config\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775858 master-0 kubenswrapper[16352]: I0307 21:31:01.775826 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.775919 master-0 kubenswrapper[16352]: I0307 21:31:01.775877 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-web-config\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.775956 master-0 kubenswrapper[16352]: I0307 21:31:01.775915 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrfp5\" (UniqueName: \"kubernetes.io/projected/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-kube-api-access-rrfp5\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2"
Mar 07 21:31:01.775985 master-0 kubenswrapper[16352]: I0307 21:31:01.775961 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2"
Mar 07 21:31:01.776020 master-0 kubenswrapper[16352]: I0307 21:31:01.776008 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-cliconfig\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.776051 master-0 kubenswrapper[16352]: I0307 21:31:01.776038 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.776108 master-0 kubenswrapper[16352]: I0307 21:31:01.776089 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snghg\" (UniqueName: \"kubernetes.io/projected/13ae001a-27f0-4b15-a204-5eaffc4fd835-kube-api-access-snghg\") pod \"multus-admission-controller-cb4c85d9-8ltxz\" (UID: \"13ae001a-27f0-4b15-a204-5eaffc4fd835\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz"
Mar 07 21:31:01.776148 master-0 kubenswrapper[16352]: I0307 21:31:01.776126 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cc4efa4-0b49-4490-9334-46c5b516399e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.776181 master-0 kubenswrapper[16352]: I0307 21:31:01.776161 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.776224 master-0 kubenswrapper[16352]: I0307 21:31:01.776205 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-ready\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2"
Mar 07 21:31:01.776267 master-0 kubenswrapper[16352]: I0307 21:31:01.776238 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-error\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.776313 master-0 kubenswrapper[16352]: I0307 21:31:01.776280 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-login\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.776363 master-0 kubenswrapper[16352]: I0307 21:31:01.776314 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7cc4efa4-0b49-4490-9334-46c5b516399e-config-out\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.776409 master-0 kubenswrapper[16352]: I0307 21:31:01.776367 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6hgx\" (UniqueName: \"kubernetes.io/projected/7cc4efa4-0b49-4490-9334-46c5b516399e-kube-api-access-w6hgx\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.776452 master-0 kubenswrapper[16352]: I0307 21:31:01.776418 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-session\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.776492 master-0 kubenswrapper[16352]: I0307 21:31:01.776452 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-audit-policies\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.776527 master-0 kubenswrapper[16352]: I0307 21:31:01.776488 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-service-ca\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.776527 master-0 kubenswrapper[16352]: I0307 21:31:01.776514 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7cc4efa4-0b49-4490-9334-46c5b516399e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.776598 master-0 kubenswrapper[16352]: I0307 21:31:01.776577 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.777151 master-0 kubenswrapper[16352]: I0307 21:31:01.777133 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/13ae001a-27f0-4b15-a204-5eaffc4fd835-webhook-certs\") pod \"multus-admission-controller-cb4c85d9-8ltxz\" (UID: \"13ae001a-27f0-4b15-a204-5eaffc4fd835\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz"
Mar 07 21:31:01.778097 master-0 kubenswrapper[16352]: I0307 21:31:01.777976 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/aff8ea86-e11f-428e-94ee-fef45f3bd856-audit-dir\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.778097 master-0 kubenswrapper[16352]: I0307 21:31:01.777984 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-serving-cert\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.778097 master-0 kubenswrapper[16352]: I0307 21:31:01.778074 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.780558 master-0 kubenswrapper[16352]: I0307 21:31:01.780453 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.781112 master-0 kubenswrapper[16352]: I0307 21:31:01.781072 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-audit-policies\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.781422 master-0 kubenswrapper[16352]: I0307 21:31:01.781265 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-config-out\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.781483 master-0 kubenswrapper[16352]: I0307 21:31:01.781408 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-service-ca\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.781520 master-0 kubenswrapper[16352]: I0307 21:31:01.781493 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.781619 master-0 kubenswrapper[16352]: I0307 21:31:01.781575 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-cliconfig\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.781900 master-0 kubenswrapper[16352]: I0307 21:31:01.781868 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.782723 master-0 kubenswrapper[16352]: I0307 21:31:01.782690 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.783384 master-0 kubenswrapper[16352]: I0307 21:31:01.783341 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.783840 master-0 kubenswrapper[16352]: I0307 21:31:01.783790 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.784346 master-0 kubenswrapper[16352]: I0307 21:31:01.784283 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.786026 master-0 kubenswrapper[16352]: I0307 21:31:01.785984 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.786173 master-0 kubenswrapper[16352]: I0307 21:31:01.786132 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-config\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.786214 master-0 kubenswrapper[16352]: I0307 21:31:01.786172 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.786660 master-0 kubenswrapper[16352]: I0307 21:31:01.786616 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.787366 master-0 kubenswrapper[16352]: I0307 21:31:01.787274 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.788132 master-0 kubenswrapper[16352]: I0307 21:31:01.788081 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-web-config\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.788280 master-0 kubenswrapper[16352]: I0307 21:31:01.788221 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.789555 master-0 kubenswrapper[16352]: I0307 21:31:01.789507 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-error\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.790193 master-0 kubenswrapper[16352]: I0307 21:31:01.790146 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.790193 master-0 kubenswrapper[16352]: I0307 21:31:01.790154 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-session\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.791421 master-0 kubenswrapper[16352]: I0307 21:31:01.791372 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-user-template-login\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.795965 master-0 kubenswrapper[16352]: I0307 21:31:01.795171 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/aff8ea86-e11f-428e-94ee-fef45f3bd856-v4-0-config-system-router-certs\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.795965 master-0 kubenswrapper[16352]: I0307 21:31:01.795367 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.795965 master-0 kubenswrapper[16352]: I0307 21:31:01.795733 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.805712 master-0 kubenswrapper[16352]: I0307 21:31:01.804094 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66bqq\" (UniqueName: \"kubernetes.io/projected/aff8ea86-e11f-428e-94ee-fef45f3bd856-kube-api-access-66bqq\") pod \"oauth-openshift-578bc8c86c-mczhd\" (UID: \"aff8ea86-e11f-428e-94ee-fef45f3bd856\") " pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd"
Mar 07 21:31:01.809709 master-0 kubenswrapper[16352]: I0307 21:31:01.806980 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snghg\" (UniqueName: \"kubernetes.io/projected/13ae001a-27f0-4b15-a204-5eaffc4fd835-kube-api-access-snghg\") pod \"multus-admission-controller-cb4c85d9-8ltxz\" (UID: \"13ae001a-27f0-4b15-a204-5eaffc4fd835\") " pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz"
Mar 07 21:31:01.809709 master-0 kubenswrapper[16352]: I0307 21:31:01.807245 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t9cjt\" (UniqueName: \"kubernetes.io/projected/6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac-kube-api-access-t9cjt\") pod \"prometheus-k8s-0\" (UID: \"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac\") " pod="openshift-monitoring/prometheus-k8s-0"
Mar 07 21:31:01.878737 master-0 kubenswrapper[16352]: I0307 21:31:01.878632 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc4efa4-0b49-4490-9334-46c5b516399e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.879005 master-0 kubenswrapper[16352]: I0307 21:31:01.878768 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.879097 master-0 kubenswrapper[16352]: I0307 21:31:01.879012 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0"
Mar 07 21:31:01.879195 master-0 kubenswrapper[16352]: I0307 
21:31:01.879146 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-web-config\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.879195 master-0 kubenswrapper[16352]: I0307 21:31:01.879179 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rrfp5\" (UniqueName: \"kubernetes.io/projected/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-kube-api-access-rrfp5\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879478 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879613 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879618 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-tuning-conf-dir\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 
21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879660 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cc4efa4-0b49-4490-9334-46c5b516399e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879720 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-ready\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879755 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7cc4efa4-0b49-4490-9334-46c5b516399e-config-out\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879789 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6hgx\" (UniqueName: \"kubernetes.io/projected/7cc4efa4-0b49-4490-9334-46c5b516399e-kube-api-access-w6hgx\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879813 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/7cc4efa4-0b49-4490-9334-46c5b516399e-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " 
pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879831 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7cc4efa4-0b49-4490-9334-46c5b516399e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879900 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-config-volume\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879936 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.879980 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7cc4efa4-0b49-4490-9334-46c5b516399e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.880003 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy-web\") pod 
\"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.880881 master-0 kubenswrapper[16352]: I0307 21:31:01.880521 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ready\" (UniqueName: \"kubernetes.io/empty-dir/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-ready\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:01.881755 master-0 kubenswrapper[16352]: I0307 21:31:01.881698 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/7cc4efa4-0b49-4490-9334-46c5b516399e-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.882455 master-0 kubenswrapper[16352]: I0307 21:31:01.882394 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/7cc4efa4-0b49-4490-9334-46c5b516399e-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.885660 master-0 kubenswrapper[16352]: I0307 21:31:01.882902 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-cni-sysctl-allowlist\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:01.885660 master-0 kubenswrapper[16352]: I0307 21:31:01.883610 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: 
\"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.885660 master-0 kubenswrapper[16352]: I0307 21:31:01.884296 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.886753 master-0 kubenswrapper[16352]: I0307 21:31:01.885832 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-web-config\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.892921 master-0 kubenswrapper[16352]: I0307 21:31:01.892872 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.893321 master-0 kubenswrapper[16352]: I0307 21:31:01.893267 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/7cc4efa4-0b49-4490-9334-46c5b516399e-tls-assets\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.897944 master-0 kubenswrapper[16352]: I0307 21:31:01.897848 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-config-volume\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.898354 master-0 kubenswrapper[16352]: I0307 21:31:01.898294 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/7cc4efa4-0b49-4490-9334-46c5b516399e-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.899090 master-0 kubenswrapper[16352]: I0307 21:31:01.899030 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/7cc4efa4-0b49-4490-9334-46c5b516399e-config-out\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.930715 master-0 kubenswrapper[16352]: I0307 21:31:01.929918 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6hgx\" (UniqueName: \"kubernetes.io/projected/7cc4efa4-0b49-4490-9334-46c5b516399e-kube-api-access-w6hgx\") pod \"alertmanager-main-0\" (UID: \"7cc4efa4-0b49-4490-9334-46c5b516399e\") " pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:01.948708 master-0 kubenswrapper[16352]: I0307 21:31:01.947301 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrfp5\" (UniqueName: \"kubernetes.io/projected/e151beb8-fcc6-4d9b-a56d-a351f43d9df5-kube-api-access-rrfp5\") pod \"cni-sysctl-allowlist-ds-rhtr2\" (UID: \"e151beb8-fcc6-4d9b-a56d-a351f43d9df5\") " pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:01.994039 master-0 kubenswrapper[16352]: I0307 21:31:01.993957 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:02.010748 master-0 kubenswrapper[16352]: I0307 21:31:02.004357 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:02.050537 master-0 kubenswrapper[16352]: I0307 21:31:02.050457 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" Mar 07 21:31:02.147540 master-0 kubenswrapper[16352]: I0307 21:31:02.147481 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:02.164823 master-0 kubenswrapper[16352]: I0307 21:31:02.163333 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Mar 07 21:31:02.511266 master-0 kubenswrapper[16352]: I0307 21:31:02.511198 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-578bc8c86c-mczhd"] Mar 07 21:31:02.519449 master-0 kubenswrapper[16352]: W0307 21:31:02.518990 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaff8ea86_e11f_428e_94ee_fef45f3bd856.slice/crio-c0491f610cbb70b2aef0a924d6abb98b09baa341eefeff51a6f8952749a740bb WatchSource:0}: Error finding container c0491f610cbb70b2aef0a924d6abb98b09baa341eefeff51a6f8952749a740bb: Status 404 returned error can't find the container with id c0491f610cbb70b2aef0a924d6abb98b09baa341eefeff51a6f8952749a740bb Mar 07 21:31:02.586860 master-0 kubenswrapper[16352]: I0307 21:31:02.586737 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" 
event={"ID":"aff8ea86-e11f-428e-94ee-fef45f3bd856","Type":"ContainerStarted","Data":"c0491f610cbb70b2aef0a924d6abb98b09baa341eefeff51a6f8952749a740bb"} Mar 07 21:31:02.591441 master-0 kubenswrapper[16352]: I0307 21:31:02.591369 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" event={"ID":"e151beb8-fcc6-4d9b-a56d-a351f43d9df5","Type":"ContainerStarted","Data":"82a2716f09984ce00ea563f405706c814a347a439f4b3562acc27fa0d3420533"} Mar 07 21:31:02.591562 master-0 kubenswrapper[16352]: I0307 21:31:02.591449 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" event={"ID":"e151beb8-fcc6-4d9b-a56d-a351f43d9df5","Type":"ContainerStarted","Data":"67882359b301b64271bf1725f37b20b945e042d29a8d63f7a4d7e15e3f64607e"} Mar 07 21:31:02.593171 master-0 kubenswrapper[16352]: I0307 21:31:02.591782 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:02.617531 master-0 kubenswrapper[16352]: I0307 21:31:02.616966 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Mar 07 21:31:02.623321 master-0 kubenswrapper[16352]: W0307 21:31:02.623157 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f3081b7_fbc7_4a5d_8dc1_d44d84ce1cac.slice/crio-d6d4c9a81c48eb19726214f2ed296ae536cd0f2821275a9a2abe7d19dac9f358 WatchSource:0}: Error finding container d6d4c9a81c48eb19726214f2ed296ae536cd0f2821275a9a2abe7d19dac9f358: Status 404 returned error can't find the container with id d6d4c9a81c48eb19726214f2ed296ae536cd0f2821275a9a2abe7d19dac9f358 Mar 07 21:31:02.624852 master-0 kubenswrapper[16352]: I0307 21:31:02.624734 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" podStartSLOduration=166.624710496 
podStartE2EDuration="2m46.624710496s" podCreationTimestamp="2026-03-07 21:28:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:31:02.620422637 +0000 UTC m=+785.691127736" watchObservedRunningTime="2026-03-07 21:31:02.624710496 +0000 UTC m=+785.695415575" Mar 07 21:31:02.657409 master-0 kubenswrapper[16352]: I0307 21:31:02.657110 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-cb4c85d9-8ltxz"] Mar 07 21:31:02.665280 master-0 kubenswrapper[16352]: W0307 21:31:02.665200 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod13ae001a_27f0_4b15_a204_5eaffc4fd835.slice/crio-a0db090a05504c2c6769b66b4a2e322446245eb6c3c60ae88134205637307163 WatchSource:0}: Error finding container a0db090a05504c2c6769b66b4a2e322446245eb6c3c60ae88134205637307163: Status 404 returned error can't find the container with id a0db090a05504c2c6769b66b4a2e322446245eb6c3c60ae88134205637307163 Mar 07 21:31:02.741787 master-0 kubenswrapper[16352]: I0307 21:31:02.741726 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Mar 07 21:31:02.761475 master-0 kubenswrapper[16352]: W0307 21:31:02.761408 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7cc4efa4_0b49_4490_9334_46c5b516399e.slice/crio-35c21c9c30463263e83fe89b6b28bd3046376749b337c232bf63cb91a1770398 WatchSource:0}: Error finding container 35c21c9c30463263e83fe89b6b28bd3046376749b337c232bf63cb91a1770398: Status 404 returned error can't find the container with id 35c21c9c30463263e83fe89b6b28bd3046376749b337c232bf63cb91a1770398 Mar 07 21:31:03.609748 master-0 kubenswrapper[16352]: I0307 21:31:03.609656 16352 generic.go:334] "Generic (PLEG): container finished" 
podID="6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac" containerID="026392768ae3374c238f656544dc5df103bbffbd4c3c94e9d1512ca61d76fd6f" exitCode=0 Mar 07 21:31:03.610626 master-0 kubenswrapper[16352]: I0307 21:31:03.609822 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerDied","Data":"026392768ae3374c238f656544dc5df103bbffbd4c3c94e9d1512ca61d76fd6f"} Mar 07 21:31:03.610626 master-0 kubenswrapper[16352]: I0307 21:31:03.610550 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerStarted","Data":"d6d4c9a81c48eb19726214f2ed296ae536cd0f2821275a9a2abe7d19dac9f358"} Mar 07 21:31:03.617923 master-0 kubenswrapper[16352]: I0307 21:31:03.617852 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" event={"ID":"13ae001a-27f0-4b15-a204-5eaffc4fd835","Type":"ContainerStarted","Data":"56f12f2323cac4362a3d484cc9a76634a7aa0d2cfd3bc9722b58c13c41e99344"} Mar 07 21:31:03.618139 master-0 kubenswrapper[16352]: I0307 21:31:03.617975 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" event={"ID":"13ae001a-27f0-4b15-a204-5eaffc4fd835","Type":"ContainerStarted","Data":"99796a93c4ec9032d43a0515d996bd40be4bd01dc4be16678dfe058dc640c77a"} Mar 07 21:31:03.618237 master-0 kubenswrapper[16352]: I0307 21:31:03.618045 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" event={"ID":"13ae001a-27f0-4b15-a204-5eaffc4fd835","Type":"ContainerStarted","Data":"a0db090a05504c2c6769b66b4a2e322446245eb6c3c60ae88134205637307163"} Mar 07 21:31:03.621662 master-0 kubenswrapper[16352]: I0307 21:31:03.621526 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" event={"ID":"aff8ea86-e11f-428e-94ee-fef45f3bd856","Type":"ContainerStarted","Data":"79d69384d3e09d3145bd4fe43bafeb1f319269a207f4bd579acf8b9db33d1182"} Mar 07 21:31:03.621978 master-0 kubenswrapper[16352]: I0307 21:31:03.621934 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:03.630390 master-0 kubenswrapper[16352]: I0307 21:31:03.628891 16352 generic.go:334] "Generic (PLEG): container finished" podID="7cc4efa4-0b49-4490-9334-46c5b516399e" containerID="2b93efa9c72e5a9a2d1edd35c0e8b034a0ae5bdc825a349a10e0b6740215a702" exitCode=0 Mar 07 21:31:03.630390 master-0 kubenswrapper[16352]: I0307 21:31:03.630027 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerDied","Data":"2b93efa9c72e5a9a2d1edd35c0e8b034a0ae5bdc825a349a10e0b6740215a702"} Mar 07 21:31:03.630390 master-0 kubenswrapper[16352]: I0307 21:31:03.630209 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerStarted","Data":"35c21c9c30463263e83fe89b6b28bd3046376749b337c232bf63cb91a1770398"} Mar 07 21:31:03.632068 master-0 kubenswrapper[16352]: I0307 21:31:03.632008 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" Mar 07 21:31:03.676649 master-0 kubenswrapper[16352]: I0307 21:31:03.676544 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-multus/cni-sysctl-allowlist-ds-rhtr2" Mar 07 21:31:03.739747 master-0 kubenswrapper[16352]: I0307 21:31:03.739618 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-578bc8c86c-mczhd" 
podStartSLOduration=164.739586088 podStartE2EDuration="2m44.739586088s" podCreationTimestamp="2026-03-07 21:28:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:31:03.733975058 +0000 UTC m=+786.804680167" watchObservedRunningTime="2026-03-07 21:31:03.739586088 +0000 UTC m=+786.810291177" Mar 07 21:31:03.755251 master-0 kubenswrapper[16352]: I0307 21:31:03.749190 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-cb4c85d9-8ltxz" podStartSLOduration=163.749161309 podStartE2EDuration="2m43.749161309s" podCreationTimestamp="2026-03-07 21:28:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:31:03.698362287 +0000 UTC m=+786.769067406" watchObservedRunningTime="2026-03-07 21:31:03.749161309 +0000 UTC m=+786.819866358" Mar 07 21:31:03.820720 master-0 kubenswrapper[16352]: I0307 21:31:03.813437 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8"] Mar 07 21:31:03.820720 master-0 kubenswrapper[16352]: I0307 21:31:03.813674 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="multus-admission-controller" containerID="cri-o://932e77a95a266f3a49729a833c9467a215cc08ba8594088722b1b8a34b918e54" gracePeriod=30 Mar 07 21:31:03.820720 master-0 kubenswrapper[16352]: I0307 21:31:03.813862 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="kube-rbac-proxy" containerID="cri-o://95a50f03c73d26087cb603eac561fb93e94820fd631d9ebde8bc2aec42f081ec" 
gracePeriod=30 Mar 07 21:31:04.643329 master-0 kubenswrapper[16352]: I0307 21:31:04.643220 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerStarted","Data":"b7d753af4e2457434cfcef9eeb2092527b692ecd2e94f2070a245f39f20d57cd"} Mar 07 21:31:04.643633 master-0 kubenswrapper[16352]: I0307 21:31:04.643343 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerStarted","Data":"ef9f1267d8be0c19a8f8f7c5872fcddf235673334a9fd0832baab81ff73be1a3"} Mar 07 21:31:04.643633 master-0 kubenswrapper[16352]: I0307 21:31:04.643359 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerStarted","Data":"c028c09f199a16ef6583b581c44524181963d956ee97e21bcdc93523539e2ba6"} Mar 07 21:31:04.643633 master-0 kubenswrapper[16352]: I0307 21:31:04.643376 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerStarted","Data":"1444ee0b6f799a0959867293b0217aa2c7abff5e8ce6d5a40eb4d1ade97d0ff1"} Mar 07 21:31:04.643633 master-0 kubenswrapper[16352]: I0307 21:31:04.643387 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerStarted","Data":"104f27260fbcd8955a17794c2a319d5a33b4d762030ebce7dfbc2ab88570a9d7"} Mar 07 21:31:04.648465 master-0 kubenswrapper[16352]: I0307 21:31:04.648373 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerStarted","Data":"09942458429b760ce998dec0eac226e175ab56ce723129ffbbaa427ba16ae2e3"} Mar 07 
21:31:04.648465 master-0 kubenswrapper[16352]: I0307 21:31:04.648409 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerStarted","Data":"48e00a30b77e4a6fc11057c15172507aa7ca7ea2b5bbc8fc240530916405349e"} Mar 07 21:31:04.648465 master-0 kubenswrapper[16352]: I0307 21:31:04.648422 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerStarted","Data":"9b7d3c2b98f0001af9ace16dad359e2b83abae4a24e96f1af2a314f00e677a1d"} Mar 07 21:31:04.648465 master-0 kubenswrapper[16352]: I0307 21:31:04.648432 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerStarted","Data":"356bd2e9dcb1e757970c5210597887bc383bbc26e5af86c71f87ccaffaaa146d"} Mar 07 21:31:04.652742 master-0 kubenswrapper[16352]: I0307 21:31:04.652560 16352 generic.go:334] "Generic (PLEG): container finished" podID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerID="95a50f03c73d26087cb603eac561fb93e94820fd631d9ebde8bc2aec42f081ec" exitCode=0 Mar 07 21:31:04.654806 master-0 kubenswrapper[16352]: I0307 21:31:04.654741 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" event={"ID":"ae7ca2b4-ab3c-44f5-b211-f68cd165349d","Type":"ContainerDied","Data":"95a50f03c73d26087cb603eac561fb93e94820fd631d9ebde8bc2aec42f081ec"} Mar 07 21:31:05.680281 master-0 kubenswrapper[16352]: I0307 21:31:05.678466 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"6f3081b7-fbc7-4a5d-8dc1-d44d84ce1cac","Type":"ContainerStarted","Data":"421d4613960695c16487086ceb8949f7599505d54bbf4c21426fb5dd6a99778a"} Mar 07 21:31:05.688042 master-0 kubenswrapper[16352]: I0307 
21:31:05.687961 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerStarted","Data":"0865622303f7c19788e814efd12029df76ddd9fa734df4d3fb694ad42de56e0a"} Mar 07 21:31:05.688122 master-0 kubenswrapper[16352]: I0307 21:31:05.688054 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"7cc4efa4-0b49-4490-9334-46c5b516399e","Type":"ContainerStarted","Data":"8451ea069d1608389e226aa750c9b5bfbbc5562bbd280a1b353498c11563fbb5"} Mar 07 21:31:05.719297 master-0 kubenswrapper[16352]: I0307 21:31:05.719189 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=168.719163666 podStartE2EDuration="2m48.719163666s" podCreationTimestamp="2026-03-07 21:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:31:05.713800632 +0000 UTC m=+788.784505751" watchObservedRunningTime="2026-03-07 21:31:05.719163666 +0000 UTC m=+788.789868735" Mar 07 21:31:05.755956 master-0 kubenswrapper[16352]: I0307 21:31:05.755795 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=168.75570638 podStartE2EDuration="2m48.75570638s" podCreationTimestamp="2026-03-07 21:28:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:31:05.754636955 +0000 UTC m=+788.825342044" watchObservedRunningTime="2026-03-07 21:31:05.75570638 +0000 UTC m=+788.826411469" Mar 07 21:31:07.001518 master-0 kubenswrapper[16352]: I0307 21:31:07.001401 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:31:08.585208 master-0 
kubenswrapper[16352]: I0307 21:31:08.585053 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:31:08.591451 master-0 kubenswrapper[16352]: I0307 21:31:08.591381 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:31:33.975430 master-0 kubenswrapper[16352]: I0307 21:31:33.975322 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-56bbfd46b8-6qcf8_ae7ca2b4-ab3c-44f5-b211-f68cd165349d/multus-admission-controller/0.log" Mar 07 21:31:33.977109 master-0 kubenswrapper[16352]: I0307 21:31:33.975443 16352 generic.go:334] "Generic (PLEG): container finished" podID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerID="932e77a95a266f3a49729a833c9467a215cc08ba8594088722b1b8a34b918e54" exitCode=137 Mar 07 21:31:33.977109 master-0 kubenswrapper[16352]: I0307 21:31:33.975506 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" event={"ID":"ae7ca2b4-ab3c-44f5-b211-f68cd165349d","Type":"ContainerDied","Data":"932e77a95a266f3a49729a833c9467a215cc08ba8594088722b1b8a34b918e54"} Mar 07 21:31:34.666259 master-0 kubenswrapper[16352]: I0307 21:31:34.666187 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-56bbfd46b8-6qcf8_ae7ca2b4-ab3c-44f5-b211-f68cd165349d/multus-admission-controller/0.log" Mar 07 21:31:34.666457 master-0 kubenswrapper[16352]: I0307 21:31:34.666330 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:31:34.699596 master-0 kubenswrapper[16352]: I0307 21:31:34.699516 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrw5d\" (UniqueName: \"kubernetes.io/projected/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-kube-api-access-lrw5d\") pod \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " Mar 07 21:31:34.703726 master-0 kubenswrapper[16352]: I0307 21:31:34.703666 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-webhook-certs\") pod \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\" (UID: \"ae7ca2b4-ab3c-44f5-b211-f68cd165349d\") " Mar 07 21:31:34.704613 master-0 kubenswrapper[16352]: I0307 21:31:34.704565 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-kube-api-access-lrw5d" (OuterVolumeSpecName: "kube-api-access-lrw5d") pod "ae7ca2b4-ab3c-44f5-b211-f68cd165349d" (UID: "ae7ca2b4-ab3c-44f5-b211-f68cd165349d"). InnerVolumeSpecName "kube-api-access-lrw5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:31:34.715802 master-0 kubenswrapper[16352]: I0307 21:31:34.709072 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "ae7ca2b4-ab3c-44f5-b211-f68cd165349d" (UID: "ae7ca2b4-ab3c-44f5-b211-f68cd165349d"). InnerVolumeSpecName "webhook-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:31:34.806192 master-0 kubenswrapper[16352]: I0307 21:31:34.806059 16352 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-webhook-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:31:34.806192 master-0 kubenswrapper[16352]: I0307 21:31:34.806141 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrw5d\" (UniqueName: \"kubernetes.io/projected/ae7ca2b4-ab3c-44f5-b211-f68cd165349d-kube-api-access-lrw5d\") on node \"master-0\" DevicePath \"\"" Mar 07 21:31:34.988858 master-0 kubenswrapper[16352]: I0307 21:31:34.988574 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-admission-controller-56bbfd46b8-6qcf8_ae7ca2b4-ab3c-44f5-b211-f68cd165349d/multus-admission-controller/0.log" Mar 07 21:31:34.988858 master-0 kubenswrapper[16352]: I0307 21:31:34.988711 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" event={"ID":"ae7ca2b4-ab3c-44f5-b211-f68cd165349d","Type":"ContainerDied","Data":"fc13c84fddc8b39e3ae583ab46f78d799260ed9607b8c779835700e3973fc081"} Mar 07 21:31:34.988858 master-0 kubenswrapper[16352]: I0307 21:31:34.988776 16352 scope.go:117] "RemoveContainer" containerID="95a50f03c73d26087cb603eac561fb93e94820fd631d9ebde8bc2aec42f081ec" Mar 07 21:31:34.990557 master-0 kubenswrapper[16352]: I0307 21:31:34.988906 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8" Mar 07 21:31:35.041930 master-0 kubenswrapper[16352]: I0307 21:31:35.020158 16352 scope.go:117] "RemoveContainer" containerID="932e77a95a266f3a49729a833c9467a215cc08ba8594088722b1b8a34b918e54" Mar 07 21:31:35.103461 master-0 kubenswrapper[16352]: I0307 21:31:35.103382 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8"] Mar 07 21:31:35.115221 master-0 kubenswrapper[16352]: I0307 21:31:35.115116 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-multus/multus-admission-controller-56bbfd46b8-6qcf8"] Mar 07 21:31:35.204936 master-0 kubenswrapper[16352]: I0307 21:31:35.204850 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" path="/var/lib/kubelet/pods/ae7ca2b4-ab3c-44f5-b211-f68cd165349d/volumes" Mar 07 21:32:02.002555 master-0 kubenswrapper[16352]: I0307 21:32:02.002413 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:32:02.039662 master-0 kubenswrapper[16352]: I0307 21:32:02.039023 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:32:02.335828 master-0 kubenswrapper[16352]: I0307 21:32:02.335655 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Mar 07 21:32:03.049746 master-0 kubenswrapper[16352]: I0307 21:32:03.046721 16352 scope.go:117] "RemoveContainer" containerID="49cc11a235efe78997a02668cffbda8c251aec39c02e2f7118030908ead8c408" Mar 07 21:32:19.234667 master-0 kubenswrapper[16352]: I0307 21:32:19.234594 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f9c4688bb-5k492"] Mar 07 21:32:19.235436 master-0 kubenswrapper[16352]: E0307 21:32:19.234948 16352 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="multus-admission-controller" Mar 07 21:32:19.235436 master-0 kubenswrapper[16352]: I0307 21:32:19.234961 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="multus-admission-controller" Mar 07 21:32:19.235436 master-0 kubenswrapper[16352]: E0307 21:32:19.234986 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="kube-rbac-proxy" Mar 07 21:32:19.235436 master-0 kubenswrapper[16352]: I0307 21:32:19.234992 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="kube-rbac-proxy" Mar 07 21:32:19.235436 master-0 kubenswrapper[16352]: I0307 21:32:19.235188 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="kube-rbac-proxy" Mar 07 21:32:19.235436 master-0 kubenswrapper[16352]: I0307 21:32:19.235248 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae7ca2b4-ab3c-44f5-b211-f68cd165349d" containerName="multus-admission-controller" Mar 07 21:32:19.235966 master-0 kubenswrapper[16352]: I0307 21:32:19.235939 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.260107 master-0 kubenswrapper[16352]: I0307 21:32:19.260001 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f9c4688bb-5k492"] Mar 07 21:32:19.355102 master-0 kubenswrapper[16352]: I0307 21:32:19.355013 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-oauth-serving-cert\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.355102 master-0 kubenswrapper[16352]: I0307 21:32:19.355093 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-62cnz\" (UniqueName: \"kubernetes.io/projected/765298df-6296-4283-a8dc-20135b6765ea-kube-api-access-62cnz\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.355425 master-0 kubenswrapper[16352]: I0307 21:32:19.355149 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-serving-cert\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.355425 master-0 kubenswrapper[16352]: I0307 21:32:19.355195 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-oauth-config\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 
21:32:19.355425 master-0 kubenswrapper[16352]: I0307 21:32:19.355234 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-trusted-ca-bundle\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.355425 master-0 kubenswrapper[16352]: I0307 21:32:19.355302 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-service-ca\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.355425 master-0 kubenswrapper[16352]: I0307 21:32:19.355366 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-console-config\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.458054 master-0 kubenswrapper[16352]: I0307 21:32:19.457931 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-service-ca\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.458054 master-0 kubenswrapper[16352]: I0307 21:32:19.458024 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-console-config\") pod \"console-6f9c4688bb-5k492\" (UID: 
\"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.458488 master-0 kubenswrapper[16352]: I0307 21:32:19.458396 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-oauth-serving-cert\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.458607 master-0 kubenswrapper[16352]: I0307 21:32:19.458570 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-62cnz\" (UniqueName: \"kubernetes.io/projected/765298df-6296-4283-a8dc-20135b6765ea-kube-api-access-62cnz\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.458895 master-0 kubenswrapper[16352]: I0307 21:32:19.458835 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-serving-cert\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.459079 master-0 kubenswrapper[16352]: I0307 21:32:19.459011 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-oauth-config\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.459201 master-0 kubenswrapper[16352]: I0307 21:32:19.459152 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-trusted-ca-bundle\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.459609 master-0 kubenswrapper[16352]: I0307 21:32:19.459542 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-oauth-serving-cert\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.460384 master-0 kubenswrapper[16352]: I0307 21:32:19.460306 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-service-ca\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.460569 master-0 kubenswrapper[16352]: I0307 21:32:19.460521 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-console-config\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.462245 master-0 kubenswrapper[16352]: I0307 21:32:19.462160 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-trusted-ca-bundle\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.464846 master-0 kubenswrapper[16352]: I0307 21:32:19.464770 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-serving-cert\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.465308 master-0 kubenswrapper[16352]: I0307 21:32:19.465228 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-oauth-config\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.483336 master-0 kubenswrapper[16352]: I0307 21:32:19.483264 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-62cnz\" (UniqueName: \"kubernetes.io/projected/765298df-6296-4283-a8dc-20135b6765ea-kube-api-access-62cnz\") pod \"console-6f9c4688bb-5k492\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:19.563161 master-0 kubenswrapper[16352]: I0307 21:32:19.562964 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:20.146757 master-0 kubenswrapper[16352]: I0307 21:32:20.146594 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f9c4688bb-5k492"] Mar 07 21:32:20.149723 master-0 kubenswrapper[16352]: W0307 21:32:20.149620 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod765298df_6296_4283_a8dc_20135b6765ea.slice/crio-a7832ac75f5c5a132598b682a567a7c8129e13c5265490814e93d667d576ff53 WatchSource:0}: Error finding container a7832ac75f5c5a132598b682a567a7c8129e13c5265490814e93d667d576ff53: Status 404 returned error can't find the container with id a7832ac75f5c5a132598b682a567a7c8129e13c5265490814e93d667d576ff53 Mar 07 21:32:20.486501 master-0 kubenswrapper[16352]: I0307 21:32:20.486239 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9c4688bb-5k492" event={"ID":"765298df-6296-4283-a8dc-20135b6765ea","Type":"ContainerStarted","Data":"12d200dab0789d8cea4ca8b081e47f4b9900b9437626abbfe197ed73e6e18b6d"} Mar 07 21:32:20.486501 master-0 kubenswrapper[16352]: I0307 21:32:20.486433 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9c4688bb-5k492" event={"ID":"765298df-6296-4283-a8dc-20135b6765ea","Type":"ContainerStarted","Data":"a7832ac75f5c5a132598b682a567a7c8129e13c5265490814e93d667d576ff53"} Mar 07 21:32:20.516855 master-0 kubenswrapper[16352]: I0307 21:32:20.516733 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f9c4688bb-5k492" podStartSLOduration=1.5166523889999999 podStartE2EDuration="1.516652389s" podCreationTimestamp="2026-03-07 21:32:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:32:20.508915451 +0000 UTC m=+863.579620520" 
watchObservedRunningTime="2026-03-07 21:32:20.516652389 +0000 UTC m=+863.587357448" Mar 07 21:32:29.563700 master-0 kubenswrapper[16352]: I0307 21:32:29.563562 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:29.563700 master-0 kubenswrapper[16352]: I0307 21:32:29.563662 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:29.573053 master-0 kubenswrapper[16352]: I0307 21:32:29.572973 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:29.578678 master-0 kubenswrapper[16352]: I0307 21:32:29.578613 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:32:29.686910 master-0 kubenswrapper[16352]: I0307 21:32:29.686825 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d844fb5f-9b28j"] Mar 07 21:32:54.735336 master-0 kubenswrapper[16352]: I0307 21:32:54.735178 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-64d844fb5f-9b28j" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" containerID="cri-o://e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699" gracePeriod=15 Mar 07 21:32:55.328970 master-0 kubenswrapper[16352]: I0307 21:32:55.328516 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d844fb5f-9b28j_253bb615-1b60-4112-aee8-f572d1c84114/console/1.log" Mar 07 21:32:55.329270 master-0 kubenswrapper[16352]: I0307 21:32:55.329199 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d844fb5f-9b28j_253bb615-1b60-4112-aee8-f572d1c84114/console/0.log" Mar 07 21:32:55.329342 master-0 kubenswrapper[16352]: I0307 21:32:55.329309 16352 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:32:55.399762 master-0 kubenswrapper[16352]: I0307 21:32:55.399595 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-serving-cert\") pod \"253bb615-1b60-4112-aee8-f572d1c84114\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " Mar 07 21:32:55.400038 master-0 kubenswrapper[16352]: I0307 21:32:55.399820 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-trusted-ca-bundle\") pod \"253bb615-1b60-4112-aee8-f572d1c84114\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " Mar 07 21:32:55.400038 master-0 kubenswrapper[16352]: I0307 21:32:55.399892 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-oauth-serving-cert\") pod \"253bb615-1b60-4112-aee8-f572d1c84114\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " Mar 07 21:32:55.400038 master-0 kubenswrapper[16352]: I0307 21:32:55.399936 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xmkfj\" (UniqueName: \"kubernetes.io/projected/253bb615-1b60-4112-aee8-f572d1c84114-kube-api-access-xmkfj\") pod \"253bb615-1b60-4112-aee8-f572d1c84114\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " Mar 07 21:32:55.400038 master-0 kubenswrapper[16352]: I0307 21:32:55.399989 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-oauth-config\") pod \"253bb615-1b60-4112-aee8-f572d1c84114\" (UID: 
\"253bb615-1b60-4112-aee8-f572d1c84114\") " Mar 07 21:32:55.400182 master-0 kubenswrapper[16352]: I0307 21:32:55.400073 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-service-ca\") pod \"253bb615-1b60-4112-aee8-f572d1c84114\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " Mar 07 21:32:55.400825 master-0 kubenswrapper[16352]: I0307 21:32:55.400244 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-console-config\") pod \"253bb615-1b60-4112-aee8-f572d1c84114\" (UID: \"253bb615-1b60-4112-aee8-f572d1c84114\") " Mar 07 21:32:55.401133 master-0 kubenswrapper[16352]: I0307 21:32:55.401064 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "253bb615-1b60-4112-aee8-f572d1c84114" (UID: "253bb615-1b60-4112-aee8-f572d1c84114"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:32:55.401315 master-0 kubenswrapper[16352]: I0307 21:32:55.401210 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-console-config" (OuterVolumeSpecName: "console-config") pod "253bb615-1b60-4112-aee8-f572d1c84114" (UID: "253bb615-1b60-4112-aee8-f572d1c84114"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:32:55.401315 master-0 kubenswrapper[16352]: I0307 21:32:55.401295 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-service-ca" (OuterVolumeSpecName: "service-ca") pod "253bb615-1b60-4112-aee8-f572d1c84114" (UID: "253bb615-1b60-4112-aee8-f572d1c84114"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:32:55.401462 master-0 kubenswrapper[16352]: I0307 21:32:55.401330 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "253bb615-1b60-4112-aee8-f572d1c84114" (UID: "253bb615-1b60-4112-aee8-f572d1c84114"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:32:55.403959 master-0 kubenswrapper[16352]: I0307 21:32:55.403898 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "253bb615-1b60-4112-aee8-f572d1c84114" (UID: "253bb615-1b60-4112-aee8-f572d1c84114"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:32:55.404369 master-0 kubenswrapper[16352]: I0307 21:32:55.404321 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/253bb615-1b60-4112-aee8-f572d1c84114-kube-api-access-xmkfj" (OuterVolumeSpecName: "kube-api-access-xmkfj") pod "253bb615-1b60-4112-aee8-f572d1c84114" (UID: "253bb615-1b60-4112-aee8-f572d1c84114"). InnerVolumeSpecName "kube-api-access-xmkfj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:32:55.404652 master-0 kubenswrapper[16352]: I0307 21:32:55.404588 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "253bb615-1b60-4112-aee8-f572d1c84114" (UID: "253bb615-1b60-4112-aee8-f572d1c84114"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:32:55.503218 master-0 kubenswrapper[16352]: I0307 21:32:55.503134 16352 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:32:55.503218 master-0 kubenswrapper[16352]: I0307 21:32:55.503199 16352 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:32:55.503218 master-0 kubenswrapper[16352]: I0307 21:32:55.503219 16352 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:32:55.503597 master-0 kubenswrapper[16352]: I0307 21:32:55.503239 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xmkfj\" (UniqueName: \"kubernetes.io/projected/253bb615-1b60-4112-aee8-f572d1c84114-kube-api-access-xmkfj\") on node \"master-0\" DevicePath \"\"" Mar 07 21:32:55.503597 master-0 kubenswrapper[16352]: I0307 21:32:55.503257 16352 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/253bb615-1b60-4112-aee8-f572d1c84114-console-oauth-config\") on node \"master-0\" DevicePath 
\"\"" Mar 07 21:32:55.503597 master-0 kubenswrapper[16352]: I0307 21:32:55.503275 16352 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:32:55.503597 master-0 kubenswrapper[16352]: I0307 21:32:55.503294 16352 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/253bb615-1b60-4112-aee8-f572d1c84114-console-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:32:55.855163 master-0 kubenswrapper[16352]: I0307 21:32:55.855053 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d844fb5f-9b28j_253bb615-1b60-4112-aee8-f572d1c84114/console/1.log" Mar 07 21:32:55.856278 master-0 kubenswrapper[16352]: I0307 21:32:55.855872 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-64d844fb5f-9b28j_253bb615-1b60-4112-aee8-f572d1c84114/console/0.log" Mar 07 21:32:55.856278 master-0 kubenswrapper[16352]: I0307 21:32:55.855943 16352 generic.go:334] "Generic (PLEG): container finished" podID="253bb615-1b60-4112-aee8-f572d1c84114" containerID="e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699" exitCode=2 Mar 07 21:32:55.856278 master-0 kubenswrapper[16352]: I0307 21:32:55.855993 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d844fb5f-9b28j" event={"ID":"253bb615-1b60-4112-aee8-f572d1c84114","Type":"ContainerDied","Data":"e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699"} Mar 07 21:32:55.856278 master-0 kubenswrapper[16352]: I0307 21:32:55.856038 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-64d844fb5f-9b28j" event={"ID":"253bb615-1b60-4112-aee8-f572d1c84114","Type":"ContainerDied","Data":"b6d6be69bca0675d073552dfe02cec1c5e47fac746e07e8d15c549a48ffeea21"} Mar 07 21:32:55.856278 
master-0 kubenswrapper[16352]: I0307 21:32:55.856068 16352 scope.go:117] "RemoveContainer" containerID="e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699" Mar 07 21:32:55.856278 master-0 kubenswrapper[16352]: I0307 21:32:55.856091 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-64d844fb5f-9b28j" Mar 07 21:32:55.884252 master-0 kubenswrapper[16352]: I0307 21:32:55.884154 16352 scope.go:117] "RemoveContainer" containerID="79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371" Mar 07 21:32:55.914509 master-0 kubenswrapper[16352]: I0307 21:32:55.914429 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-64d844fb5f-9b28j"] Mar 07 21:32:55.919464 master-0 kubenswrapper[16352]: I0307 21:32:55.919399 16352 scope.go:117] "RemoveContainer" containerID="e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699" Mar 07 21:32:55.921257 master-0 kubenswrapper[16352]: E0307 21:32:55.920335 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699\": container with ID starting with e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699 not found: ID does not exist" containerID="e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699" Mar 07 21:32:55.921441 master-0 kubenswrapper[16352]: I0307 21:32:55.921254 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699"} err="failed to get container status \"e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699\": rpc error: code = NotFound desc = could not find container \"e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699\": container with ID starting with 
e117d5873391bdc041ac70b850174646f8b913dbb187280f674b0b3881d78699 not found: ID does not exist" Mar 07 21:32:55.921441 master-0 kubenswrapper[16352]: I0307 21:32:55.921308 16352 scope.go:117] "RemoveContainer" containerID="79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371" Mar 07 21:32:55.922194 master-0 kubenswrapper[16352]: E0307 21:32:55.922129 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371\": container with ID starting with 79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371 not found: ID does not exist" containerID="79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371" Mar 07 21:32:55.922194 master-0 kubenswrapper[16352]: I0307 21:32:55.922162 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371"} err="failed to get container status \"79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371\": rpc error: code = NotFound desc = could not find container \"79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371\": container with ID starting with 79e4c3e71a2027afb0f87ab2cceca499afb5cdf193cf563e1becb71a4ad3f371 not found: ID does not exist" Mar 07 21:32:55.933333 master-0 kubenswrapper[16352]: I0307 21:32:55.933203 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-64d844fb5f-9b28j"] Mar 07 21:32:57.206231 master-0 kubenswrapper[16352]: I0307 21:32:57.206056 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="253bb615-1b60-4112-aee8-f572d1c84114" path="/var/lib/kubelet/pods/253bb615-1b60-4112-aee8-f572d1c84114/volumes" Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: I0307 21:32:59.918376 16352 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["sushy-emulator/sushy-emulator-78f6d7d749-xgc79"] Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: E0307 21:32:59.918899 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: I0307 21:32:59.918923 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: E0307 21:32:59.918988 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: I0307 21:32:59.919000 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: I0307 21:32:59.919367 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: I0307 21:32:59.919408 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="253bb615-1b60-4112-aee8-f572d1c84114" containerName="console" Mar 07 21:32:59.920714 master-0 kubenswrapper[16352]: I0307 21:32:59.920261 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:32:59.923357 master-0 kubenswrapper[16352]: I0307 21:32:59.922882 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"openshift-service-ca.crt" Mar 07 21:32:59.923357 master-0 kubenswrapper[16352]: I0307 21:32:59.923309 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"kube-root-ca.crt" Mar 07 21:32:59.925752 master-0 kubenswrapper[16352]: I0307 21:32:59.923776 16352 reflector.go:368] Caches populated for *v1.Secret from object-"sushy-emulator"/"os-client-config" Mar 07 21:32:59.925752 master-0 kubenswrapper[16352]: I0307 21:32:59.924595 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"sushy-emulator"/"sushy-emulator-config" Mar 07 21:32:59.936640 master-0 kubenswrapper[16352]: I0307 21:32:59.936553 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xgc79"] Mar 07 21:32:59.995004 master-0 kubenswrapper[16352]: I0307 21:32:59.994898 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:32:59.995266 master-0 kubenswrapper[16352]: I0307 21:32:59.995148 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-66kln\" (UniqueName: \"kubernetes.io/projected/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-kube-api-access-66kln\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:32:59.995266 master-0 kubenswrapper[16352]: I0307 21:32:59.995196 16352 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-os-client-config\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.097825 master-0 kubenswrapper[16352]: I0307 21:33:00.097737 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.098191 master-0 kubenswrapper[16352]: I0307 21:33:00.097866 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-66kln\" (UniqueName: \"kubernetes.io/projected/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-kube-api-access-66kln\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.098191 master-0 kubenswrapper[16352]: I0307 21:33:00.097913 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-os-client-config\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.099913 master-0 kubenswrapper[16352]: I0307 21:33:00.098847 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-sushy-emulator-config\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " 
pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.103773 master-0 kubenswrapper[16352]: I0307 21:33:00.103652 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-os-client-config\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.119402 master-0 kubenswrapper[16352]: I0307 21:33:00.119327 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-66kln\" (UniqueName: \"kubernetes.io/projected/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-kube-api-access-66kln\") pod \"sushy-emulator-78f6d7d749-xgc79\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.281324 master-0 kubenswrapper[16352]: I0307 21:33:00.281135 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:00.818949 master-0 kubenswrapper[16352]: I0307 21:33:00.818846 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xgc79"] Mar 07 21:33:00.825304 master-0 kubenswrapper[16352]: I0307 21:33:00.825229 16352 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 21:33:00.914626 master-0 kubenswrapper[16352]: I0307 21:33:00.914477 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" event={"ID":"56c0a57c-e9dd-4f2a-8e20-045a2ca28321","Type":"ContainerStarted","Data":"afcb17a3a3f3d74ea9ce256878a7ddc858abc2f1ff611c99ad43b4f92ca74f13"} Mar 07 21:33:07.992799 master-0 kubenswrapper[16352]: I0307 21:33:07.992672 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" event={"ID":"56c0a57c-e9dd-4f2a-8e20-045a2ca28321","Type":"ContainerStarted","Data":"85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926"} Mar 07 21:33:08.023067 master-0 kubenswrapper[16352]: I0307 21:33:08.022898 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" podStartSLOduration=2.474570629 podStartE2EDuration="9.022870399s" podCreationTimestamp="2026-03-07 21:32:59 +0000 UTC" firstStartedPulling="2026-03-07 21:33:00.825083809 +0000 UTC m=+903.895788898" lastFinishedPulling="2026-03-07 21:33:07.373383569 +0000 UTC m=+910.444088668" observedRunningTime="2026-03-07 21:33:08.018177581 +0000 UTC m=+911.088882680" watchObservedRunningTime="2026-03-07 21:33:08.022870399 +0000 UTC m=+911.093575498" Mar 07 21:33:10.281934 master-0 kubenswrapper[16352]: I0307 21:33:10.281860 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:10.281934 master-0 
kubenswrapper[16352]: I0307 21:33:10.281936 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:10.297766 master-0 kubenswrapper[16352]: I0307 21:33:10.297668 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:11.026406 master-0 kubenswrapper[16352]: I0307 21:33:11.026325 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:33:30.774383 master-0 kubenswrapper[16352]: I0307 21:33:30.774273 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd"] Mar 07 21:33:30.776958 master-0 kubenswrapper[16352]: I0307 21:33:30.776891 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:30.786807 master-0 kubenswrapper[16352]: I0307 21:33:30.786712 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd"] Mar 07 21:33:30.859079 master-0 kubenswrapper[16352]: I0307 21:33:30.858962 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/69f11e2a-533f-4fc5-bb24-bec63952f40a-os-client-config\") pod \"nova-console-poller-849dd7bd7c-wlzjd\" (UID: \"69f11e2a-533f-4fc5-bb24-bec63952f40a\") " pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:30.859386 master-0 kubenswrapper[16352]: I0307 21:33:30.859276 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg6cs\" (UniqueName: \"kubernetes.io/projected/69f11e2a-533f-4fc5-bb24-bec63952f40a-kube-api-access-qg6cs\") pod \"nova-console-poller-849dd7bd7c-wlzjd\" (UID: 
\"69f11e2a-533f-4fc5-bb24-bec63952f40a\") " pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:30.960852 master-0 kubenswrapper[16352]: I0307 21:33:30.960727 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/69f11e2a-533f-4fc5-bb24-bec63952f40a-os-client-config\") pod \"nova-console-poller-849dd7bd7c-wlzjd\" (UID: \"69f11e2a-533f-4fc5-bb24-bec63952f40a\") " pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:30.961276 master-0 kubenswrapper[16352]: I0307 21:33:30.960996 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg6cs\" (UniqueName: \"kubernetes.io/projected/69f11e2a-533f-4fc5-bb24-bec63952f40a-kube-api-access-qg6cs\") pod \"nova-console-poller-849dd7bd7c-wlzjd\" (UID: \"69f11e2a-533f-4fc5-bb24-bec63952f40a\") " pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:30.966892 master-0 kubenswrapper[16352]: I0307 21:33:30.966809 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/69f11e2a-533f-4fc5-bb24-bec63952f40a-os-client-config\") pod \"nova-console-poller-849dd7bd7c-wlzjd\" (UID: \"69f11e2a-533f-4fc5-bb24-bec63952f40a\") " pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:30.983484 master-0 kubenswrapper[16352]: I0307 21:33:30.983409 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg6cs\" (UniqueName: \"kubernetes.io/projected/69f11e2a-533f-4fc5-bb24-bec63952f40a-kube-api-access-qg6cs\") pod \"nova-console-poller-849dd7bd7c-wlzjd\" (UID: \"69f11e2a-533f-4fc5-bb24-bec63952f40a\") " pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:31.112655 master-0 kubenswrapper[16352]: I0307 21:33:31.112595 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" Mar 07 21:33:31.670858 master-0 kubenswrapper[16352]: I0307 21:33:31.670750 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd"] Mar 07 21:33:31.671852 master-0 kubenswrapper[16352]: W0307 21:33:31.671763 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69f11e2a_533f_4fc5_bb24_bec63952f40a.slice/crio-c51188a197baf492d6f4fa39e86554e9d8e1eb9620c8af6969bc78285212386a WatchSource:0}: Error finding container c51188a197baf492d6f4fa39e86554e9d8e1eb9620c8af6969bc78285212386a: Status 404 returned error can't find the container with id c51188a197baf492d6f4fa39e86554e9d8e1eb9620c8af6969bc78285212386a Mar 07 21:33:32.247402 master-0 kubenswrapper[16352]: I0307 21:33:32.247305 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" event={"ID":"69f11e2a-533f-4fc5-bb24-bec63952f40a","Type":"ContainerStarted","Data":"c51188a197baf492d6f4fa39e86554e9d8e1eb9620c8af6969bc78285212386a"} Mar 07 21:33:37.303210 master-0 kubenswrapper[16352]: I0307 21:33:37.303120 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" event={"ID":"69f11e2a-533f-4fc5-bb24-bec63952f40a","Type":"ContainerStarted","Data":"0ffb0c20548d917279de57d16c52b25d161ba2a0259f0236ff50ada56a62adba"} Mar 07 21:33:38.316289 master-0 kubenswrapper[16352]: I0307 21:33:38.316198 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" event={"ID":"69f11e2a-533f-4fc5-bb24-bec63952f40a","Type":"ContainerStarted","Data":"7655215d875751fdbdef1f110f8d86d5357a26127f4c797202f6eeb4693a5810"} Mar 07 21:33:38.359254 master-0 kubenswrapper[16352]: I0307 21:33:38.359129 16352 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="sushy-emulator/nova-console-poller-849dd7bd7c-wlzjd" podStartSLOduration=2.292522276 podStartE2EDuration="8.359100564s" podCreationTimestamp="2026-03-07 21:33:30 +0000 UTC" firstStartedPulling="2026-03-07 21:33:31.679767284 +0000 UTC m=+934.750472383" lastFinishedPulling="2026-03-07 21:33:37.746345562 +0000 UTC m=+940.817050671" observedRunningTime="2026-03-07 21:33:38.347454814 +0000 UTC m=+941.418159903" watchObservedRunningTime="2026-03-07 21:33:38.359100564 +0000 UTC m=+941.429805633" Mar 07 21:34:02.374256 master-0 kubenswrapper[16352]: I0307 21:34:02.374146 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/nova-console-recorder-6bd67877d9-cd76q"] Mar 07 21:34:02.376739 master-0 kubenswrapper[16352]: I0307 21:34:02.376659 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.385363 master-0 kubenswrapper[16352]: I0307 21:34:02.385258 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-6bd67877d9-cd76q"] Mar 07 21:34:02.527335 master-0 kubenswrapper[16352]: I0307 21:34:02.527225 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/6546464c-60a5-4eba-beda-b2fa413f9808-nova-console-recordings-pv\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.527335 master-0 kubenswrapper[16352]: I0307 21:34:02.527319 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xzlw7\" (UniqueName: \"kubernetes.io/projected/6546464c-60a5-4eba-beda-b2fa413f9808-kube-api-access-xzlw7\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " 
pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.527829 master-0 kubenswrapper[16352]: I0307 21:34:02.527381 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/6546464c-60a5-4eba-beda-b2fa413f9808-os-client-config\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.630206 master-0 kubenswrapper[16352]: I0307 21:34:02.629819 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/6546464c-60a5-4eba-beda-b2fa413f9808-os-client-config\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.630559 master-0 kubenswrapper[16352]: I0307 21:34:02.630386 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/6546464c-60a5-4eba-beda-b2fa413f9808-nova-console-recordings-pv\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.630617 master-0 kubenswrapper[16352]: I0307 21:34:02.630565 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xzlw7\" (UniqueName: \"kubernetes.io/projected/6546464c-60a5-4eba-beda-b2fa413f9808-kube-api-access-xzlw7\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.637158 master-0 kubenswrapper[16352]: I0307 21:34:02.637087 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" 
(UniqueName: \"kubernetes.io/secret/6546464c-60a5-4eba-beda-b2fa413f9808-os-client-config\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:02.649082 master-0 kubenswrapper[16352]: I0307 21:34:02.649016 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xzlw7\" (UniqueName: \"kubernetes.io/projected/6546464c-60a5-4eba-beda-b2fa413f9808-kube-api-access-xzlw7\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:03.327470 master-0 kubenswrapper[16352]: I0307 21:34:03.327377 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-console-recordings-pv\" (UniqueName: \"kubernetes.io/nfs/6546464c-60a5-4eba-beda-b2fa413f9808-nova-console-recordings-pv\") pod \"nova-console-recorder-6bd67877d9-cd76q\" (UID: \"6546464c-60a5-4eba-beda-b2fa413f9808\") " pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:03.613831 master-0 kubenswrapper[16352]: I0307 21:34:03.613768 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" Mar 07 21:34:04.191056 master-0 kubenswrapper[16352]: I0307 21:34:04.188369 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/nova-console-recorder-6bd67877d9-cd76q"] Mar 07 21:34:04.202249 master-0 kubenswrapper[16352]: W0307 21:34:04.202123 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6546464c_60a5_4eba_beda_b2fa413f9808.slice/crio-eaf66417539636181521aa7f6480f74a4f3f2b18792044b02577d97e532c683a WatchSource:0}: Error finding container eaf66417539636181521aa7f6480f74a4f3f2b18792044b02577d97e532c683a: Status 404 returned error can't find the container with id eaf66417539636181521aa7f6480f74a4f3f2b18792044b02577d97e532c683a Mar 07 21:34:04.600219 master-0 kubenswrapper[16352]: I0307 21:34:04.600136 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" event={"ID":"6546464c-60a5-4eba-beda-b2fa413f9808","Type":"ContainerStarted","Data":"eaf66417539636181521aa7f6480f74a4f3f2b18792044b02577d97e532c683a"} Mar 07 21:34:14.686255 master-0 kubenswrapper[16352]: I0307 21:34:14.686075 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" event={"ID":"6546464c-60a5-4eba-beda-b2fa413f9808","Type":"ContainerStarted","Data":"e327ae89fc59a26108fcde4f1dd9400aaf4eac75006d42316f3a5ceaa68cf1a8"} Mar 07 21:34:14.686255 master-0 kubenswrapper[16352]: I0307 21:34:14.686188 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" event={"ID":"6546464c-60a5-4eba-beda-b2fa413f9808","Type":"ContainerStarted","Data":"1a2568d751e5450ca876d755031cffabec49c2f73a1a45a294d95ba1ef21f913"} Mar 07 21:34:14.716357 master-0 kubenswrapper[16352]: I0307 21:34:14.716238 16352 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="sushy-emulator/nova-console-recorder-6bd67877d9-cd76q" podStartSLOduration=2.494561734 podStartE2EDuration="12.71620473s" podCreationTimestamp="2026-03-07 21:34:02 +0000 UTC" firstStartedPulling="2026-03-07 21:34:04.206652723 +0000 UTC m=+967.277357822" lastFinishedPulling="2026-03-07 21:34:14.428295759 +0000 UTC m=+977.499000818" observedRunningTime="2026-03-07 21:34:14.70864854 +0000 UTC m=+977.779353629" watchObservedRunningTime="2026-03-07 21:34:14.71620473 +0000 UTC m=+977.786909809" Mar 07 21:34:41.624134 master-0 kubenswrapper[16352]: I0307 21:34:41.623971 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk"] Mar 07 21:34:41.627473 master-0 kubenswrapper[16352]: I0307 21:34:41.627420 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.743341 master-0 kubenswrapper[16352]: I0307 21:34:41.743227 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.743737 master-0 kubenswrapper[16352]: I0307 21:34:41.743383 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h677w\" (UniqueName: \"kubernetes.io/projected/2913f373-9a41-4285-945f-a73e51124073-kube-api-access-h677w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 
21:34:41.743737 master-0 kubenswrapper[16352]: I0307 21:34:41.743455 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.845863 master-0 kubenswrapper[16352]: I0307 21:34:41.845765 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.845863 master-0 kubenswrapper[16352]: I0307 21:34:41.845869 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h677w\" (UniqueName: \"kubernetes.io/projected/2913f373-9a41-4285-945f-a73e51124073-kube-api-access-h677w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.846589 master-0 kubenswrapper[16352]: I0307 21:34:41.846151 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.847070 master-0 kubenswrapper[16352]: I0307 21:34:41.846998 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-util\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.847534 master-0 kubenswrapper[16352]: I0307 21:34:41.847477 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-bundle\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.875136 master-0 kubenswrapper[16352]: I0307 21:34:41.874925 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk"] Mar 07 21:34:41.942766 master-0 kubenswrapper[16352]: I0307 21:34:41.942360 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h677w\" (UniqueName: \"kubernetes.io/projected/2913f373-9a41-4285-945f-a73e51124073-kube-api-access-h677w\") pod \"7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:41.944753 master-0 kubenswrapper[16352]: I0307 21:34:41.944701 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:42.447308 master-0 kubenswrapper[16352]: I0307 21:34:42.447249 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk"] Mar 07 21:34:42.983799 master-0 kubenswrapper[16352]: I0307 21:34:42.983585 16352 generic.go:334] "Generic (PLEG): container finished" podID="2913f373-9a41-4285-945f-a73e51124073" containerID="169ea6d034910b431ec4fa7dfdca3386bfb79d78bc4a0d85ec400e55b93f4703" exitCode=0 Mar 07 21:34:42.984737 master-0 kubenswrapper[16352]: I0307 21:34:42.983792 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" event={"ID":"2913f373-9a41-4285-945f-a73e51124073","Type":"ContainerDied","Data":"169ea6d034910b431ec4fa7dfdca3386bfb79d78bc4a0d85ec400e55b93f4703"} Mar 07 21:34:42.984737 master-0 kubenswrapper[16352]: I0307 21:34:42.983862 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" event={"ID":"2913f373-9a41-4285-945f-a73e51124073","Type":"ContainerStarted","Data":"d432b98365cacc19b3b6e95a96e23c51456adf016dafab1724ba3a0729a01a14"} Mar 07 21:34:45.007306 master-0 kubenswrapper[16352]: I0307 21:34:45.007250 16352 generic.go:334] "Generic (PLEG): container finished" podID="2913f373-9a41-4285-945f-a73e51124073" containerID="e35672f96d4a191d4650de56828f503268090de668d811d812a886ac51f3b000" exitCode=0 Mar 07 21:34:45.008204 master-0 kubenswrapper[16352]: I0307 21:34:45.007323 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" 
event={"ID":"2913f373-9a41-4285-945f-a73e51124073","Type":"ContainerDied","Data":"e35672f96d4a191d4650de56828f503268090de668d811d812a886ac51f3b000"} Mar 07 21:34:46.020049 master-0 kubenswrapper[16352]: I0307 21:34:46.019962 16352 generic.go:334] "Generic (PLEG): container finished" podID="2913f373-9a41-4285-945f-a73e51124073" containerID="45957161b34f07d4260dbc363e4169aa6415f36670237fdf4526097e9fd295a6" exitCode=0 Mar 07 21:34:46.020049 master-0 kubenswrapper[16352]: I0307 21:34:46.020046 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" event={"ID":"2913f373-9a41-4285-945f-a73e51124073","Type":"ContainerDied","Data":"45957161b34f07d4260dbc363e4169aa6415f36670237fdf4526097e9fd295a6"} Mar 07 21:34:47.460647 master-0 kubenswrapper[16352]: I0307 21:34:47.460547 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:47.567724 master-0 kubenswrapper[16352]: I0307 21:34:47.567585 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h677w\" (UniqueName: \"kubernetes.io/projected/2913f373-9a41-4285-945f-a73e51124073-kube-api-access-h677w\") pod \"2913f373-9a41-4285-945f-a73e51124073\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " Mar 07 21:34:47.568222 master-0 kubenswrapper[16352]: I0307 21:34:47.567933 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-bundle\") pod \"2913f373-9a41-4285-945f-a73e51124073\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " Mar 07 21:34:47.568222 master-0 kubenswrapper[16352]: I0307 21:34:47.568028 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-util\") pod \"2913f373-9a41-4285-945f-a73e51124073\" (UID: \"2913f373-9a41-4285-945f-a73e51124073\") " Mar 07 21:34:47.569828 master-0 kubenswrapper[16352]: I0307 21:34:47.569146 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-bundle" (OuterVolumeSpecName: "bundle") pod "2913f373-9a41-4285-945f-a73e51124073" (UID: "2913f373-9a41-4285-945f-a73e51124073"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:34:47.575139 master-0 kubenswrapper[16352]: I0307 21:34:47.575077 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2913f373-9a41-4285-945f-a73e51124073-kube-api-access-h677w" (OuterVolumeSpecName: "kube-api-access-h677w") pod "2913f373-9a41-4285-945f-a73e51124073" (UID: "2913f373-9a41-4285-945f-a73e51124073"). InnerVolumeSpecName "kube-api-access-h677w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:34:47.601281 master-0 kubenswrapper[16352]: I0307 21:34:47.601081 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-util" (OuterVolumeSpecName: "util") pod "2913f373-9a41-4285-945f-a73e51124073" (UID: "2913f373-9a41-4285-945f-a73e51124073"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:34:47.671078 master-0 kubenswrapper[16352]: I0307 21:34:47.670846 16352 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:34:47.671078 master-0 kubenswrapper[16352]: I0307 21:34:47.670908 16352 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2913f373-9a41-4285-945f-a73e51124073-util\") on node \"master-0\" DevicePath \"\"" Mar 07 21:34:47.671078 master-0 kubenswrapper[16352]: I0307 21:34:47.670920 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h677w\" (UniqueName: \"kubernetes.io/projected/2913f373-9a41-4285-945f-a73e51124073-kube-api-access-h677w\") on node \"master-0\" DevicePath \"\"" Mar 07 21:34:48.046574 master-0 kubenswrapper[16352]: I0307 21:34:48.046308 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" event={"ID":"2913f373-9a41-4285-945f-a73e51124073","Type":"ContainerDied","Data":"d432b98365cacc19b3b6e95a96e23c51456adf016dafab1724ba3a0729a01a14"} Mar 07 21:34:48.046574 master-0 kubenswrapper[16352]: I0307 21:34:48.046393 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d432b98365cacc19b3b6e95a96e23c51456adf016dafab1724ba3a0729a01a14" Mar 07 21:34:48.046574 master-0 kubenswrapper[16352]: I0307 21:34:48.046416 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/7f6062bfcf66f08711c4d599873349559e66916847a22b4b74a32f97d4dkttk" Mar 07 21:34:53.727041 master-0 kubenswrapper[16352]: I0307 21:34:53.726901 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/lvms-operator-cc6c44d98-tvcmb"] Mar 07 21:34:53.728079 master-0 kubenswrapper[16352]: E0307 21:34:53.727582 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913f373-9a41-4285-945f-a73e51124073" containerName="util" Mar 07 21:34:53.728079 master-0 kubenswrapper[16352]: I0307 21:34:53.727611 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913f373-9a41-4285-945f-a73e51124073" containerName="util" Mar 07 21:34:53.728079 master-0 kubenswrapper[16352]: E0307 21:34:53.727696 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913f373-9a41-4285-945f-a73e51124073" containerName="pull" Mar 07 21:34:53.728079 master-0 kubenswrapper[16352]: I0307 21:34:53.727708 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913f373-9a41-4285-945f-a73e51124073" containerName="pull" Mar 07 21:34:53.728079 master-0 kubenswrapper[16352]: E0307 21:34:53.727735 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2913f373-9a41-4285-945f-a73e51124073" containerName="extract" Mar 07 21:34:53.728079 master-0 kubenswrapper[16352]: I0307 21:34:53.727747 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2913f373-9a41-4285-945f-a73e51124073" containerName="extract" Mar 07 21:34:53.728079 master-0 kubenswrapper[16352]: I0307 21:34:53.728002 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2913f373-9a41-4285-945f-a73e51124073" containerName="extract" Mar 07 21:34:53.728975 master-0 kubenswrapper[16352]: I0307 21:34:53.728939 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.732895 master-0 kubenswrapper[16352]: I0307 21:34:53.732855 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"kube-root-ca.crt" Mar 07 21:34:53.733205 master-0 kubenswrapper[16352]: I0307 21:34:53.732914 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-metrics-cert" Mar 07 21:34:53.733205 master-0 kubenswrapper[16352]: I0307 21:34:53.732914 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-service-cert" Mar 07 21:34:53.733318 master-0 kubenswrapper[16352]: I0307 21:34:53.733303 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-storage"/"openshift-service-ca.crt" Mar 07 21:34:53.733553 master-0 kubenswrapper[16352]: I0307 21:34:53.733518 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"lvms-operator-webhook-server-cert" Mar 07 21:34:53.754801 master-0 kubenswrapper[16352]: I0307 21:34:53.754713 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-cc6c44d98-tvcmb"] Mar 07 21:34:53.794714 master-0 kubenswrapper[16352]: I0307 21:34:53.794635 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-webhook-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.794978 master-0 kubenswrapper[16352]: I0307 21:34:53.794900 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-apiservice-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: 
\"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.795029 master-0 kubenswrapper[16352]: I0307 21:34:53.795003 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-metrics-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.795130 master-0 kubenswrapper[16352]: I0307 21:34:53.795095 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/95600a53-981e-42a6-a95f-bf7b533d0cc1-socket-dir\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.795174 master-0 kubenswrapper[16352]: I0307 21:34:53.795156 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sll7f\" (UniqueName: \"kubernetes.io/projected/95600a53-981e-42a6-a95f-bf7b533d0cc1-kube-api-access-sll7f\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.897648 master-0 kubenswrapper[16352]: I0307 21:34:53.897485 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-webhook-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.897648 master-0 kubenswrapper[16352]: I0307 21:34:53.897641 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-apiservice-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.897648 master-0 kubenswrapper[16352]: I0307 21:34:53.897700 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-metrics-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.898080 master-0 kubenswrapper[16352]: I0307 21:34:53.897931 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/95600a53-981e-42a6-a95f-bf7b533d0cc1-socket-dir\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.898080 master-0 kubenswrapper[16352]: I0307 21:34:53.898054 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sll7f\" (UniqueName: \"kubernetes.io/projected/95600a53-981e-42a6-a95f-bf7b533d0cc1-kube-api-access-sll7f\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.899946 master-0 kubenswrapper[16352]: I0307 21:34:53.898792 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/empty-dir/95600a53-981e-42a6-a95f-bf7b533d0cc1-socket-dir\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.901252 master-0 kubenswrapper[16352]: I0307 21:34:53.901212 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-apiservice-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.902711 master-0 kubenswrapper[16352]: I0307 21:34:53.902645 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-webhook-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.904703 master-0 kubenswrapper[16352]: I0307 21:34:53.904650 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/95600a53-981e-42a6-a95f-bf7b533d0cc1-metrics-cert\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:53.916933 master-0 kubenswrapper[16352]: I0307 21:34:53.916864 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sll7f\" (UniqueName: \"kubernetes.io/projected/95600a53-981e-42a6-a95f-bf7b533d0cc1-kube-api-access-sll7f\") pod \"lvms-operator-cc6c44d98-tvcmb\" (UID: \"95600a53-981e-42a6-a95f-bf7b533d0cc1\") " pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:54.046324 master-0 kubenswrapper[16352]: I0307 21:34:54.046117 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:34:54.468116 master-0 kubenswrapper[16352]: I0307 21:34:54.468034 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/lvms-operator-cc6c44d98-tvcmb"] Mar 07 21:34:55.110639 master-0 kubenswrapper[16352]: I0307 21:34:55.110566 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" event={"ID":"95600a53-981e-42a6-a95f-bf7b533d0cc1","Type":"ContainerStarted","Data":"ed2d9105ad28036bf613536a566f0924983fd54f1f001b077db74662ec561a83"} Mar 07 21:35:00.155568 master-0 kubenswrapper[16352]: I0307 21:35:00.155410 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" event={"ID":"95600a53-981e-42a6-a95f-bf7b533d0cc1","Type":"ContainerStarted","Data":"62da646d537055e248537c41913d4c8d6a5d959790b26f297b9e08bd3e4ca976"} Mar 07 21:35:00.156855 master-0 kubenswrapper[16352]: I0307 21:35:00.155912 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:35:00.162358 master-0 kubenswrapper[16352]: I0307 21:35:00.162290 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" Mar 07 21:35:00.189056 master-0 kubenswrapper[16352]: I0307 21:35:00.188920 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/lvms-operator-cc6c44d98-tvcmb" podStartSLOduration=1.9062903709999999 podStartE2EDuration="7.188891166s" podCreationTimestamp="2026-03-07 21:34:53 +0000 UTC" firstStartedPulling="2026-03-07 21:34:54.477586649 +0000 UTC m=+1017.548291718" lastFinishedPulling="2026-03-07 21:34:59.760187454 +0000 UTC m=+1022.830892513" observedRunningTime="2026-03-07 21:35:00.182395001 +0000 UTC m=+1023.253100110" watchObservedRunningTime="2026-03-07 21:35:00.188891166 +0000 UTC 
m=+1023.259596245" Mar 07 21:35:04.266711 master-0 kubenswrapper[16352]: I0307 21:35:04.266578 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg"] Mar 07 21:35:04.269654 master-0 kubenswrapper[16352]: I0307 21:35:04.269603 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.293508 master-0 kubenswrapper[16352]: I0307 21:35:04.293416 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg"] Mar 07 21:35:04.430505 master-0 kubenswrapper[16352]: I0307 21:35:04.430402 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.430891 master-0 kubenswrapper[16352]: I0307 21:35:04.430649 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.431054 master-0 kubenswrapper[16352]: I0307 21:35:04.430882 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9mhc\" (UniqueName: \"kubernetes.io/projected/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-kube-api-access-v9mhc\") pod 
\"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.532670 master-0 kubenswrapper[16352]: I0307 21:35:04.532508 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.532916 master-0 kubenswrapper[16352]: I0307 21:35:04.532762 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9mhc\" (UniqueName: \"kubernetes.io/projected/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-kube-api-access-v9mhc\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.532916 master-0 kubenswrapper[16352]: I0307 21:35:04.532880 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.533475 master-0 kubenswrapper[16352]: I0307 21:35:04.533410 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-bundle\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: 
\"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.533626 master-0 kubenswrapper[16352]: I0307 21:35:04.533588 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-util\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.555316 master-0 kubenswrapper[16352]: I0307 21:35:04.555214 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9mhc\" (UniqueName: \"kubernetes.io/projected/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-kube-api-access-v9mhc\") pod \"925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") " pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:04.598509 master-0 kubenswrapper[16352]: I0307 21:35:04.598420 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" Mar 07 21:35:05.091804 master-0 kubenswrapper[16352]: I0307 21:35:05.091750 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg"] Mar 07 21:35:05.206500 master-0 kubenswrapper[16352]: I0307 21:35:05.206429 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" event={"ID":"9b4e61c6-94ff-4048-8e7f-9c15844e3f09","Type":"ContainerStarted","Data":"ab27865261c3f85d42d57ba4643df1750e10e1fcea8b574168f22a05f58e06da"} Mar 07 21:35:06.222482 master-0 kubenswrapper[16352]: I0307 21:35:06.222384 16352 generic.go:334] "Generic (PLEG): container finished" podID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerID="240db0a886fbf1ecf759d776568298c5ec56dfcc725d87dc5e80a1849c592580" exitCode=0 Mar 07 21:35:06.222482 master-0 kubenswrapper[16352]: I0307 21:35:06.222460 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" event={"ID":"9b4e61c6-94ff-4048-8e7f-9c15844e3f09","Type":"ContainerDied","Data":"240db0a886fbf1ecf759d776568298c5ec56dfcc725d87dc5e80a1849c592580"} Mar 07 21:35:06.289115 master-0 kubenswrapper[16352]: I0307 21:35:06.289032 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb"] Mar 07 21:35:06.290832 master-0 kubenswrapper[16352]: I0307 21:35:06.290782 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.302206 master-0 kubenswrapper[16352]: I0307 21:35:06.302089 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb"] Mar 07 21:35:06.378969 master-0 kubenswrapper[16352]: I0307 21:35:06.378873 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fl5d\" (UniqueName: \"kubernetes.io/projected/c6a129f3-7280-4285-b02e-3c16b99e8db1-kube-api-access-5fl5d\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.379274 master-0 kubenswrapper[16352]: I0307 21:35:06.379224 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.379569 master-0 kubenswrapper[16352]: I0307 21:35:06.379511 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.482202 master-0 kubenswrapper[16352]: I0307 21:35:06.482000 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.482547 master-0 kubenswrapper[16352]: I0307 21:35:06.482408 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fl5d\" (UniqueName: \"kubernetes.io/projected/c6a129f3-7280-4285-b02e-3c16b99e8db1-kube-api-access-5fl5d\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.483186 master-0 kubenswrapper[16352]: I0307 21:35:06.482918 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-util\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.483186 master-0 kubenswrapper[16352]: I0307 21:35:06.483055 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-bundle\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.486270 master-0 kubenswrapper[16352]: I0307 21:35:06.483571 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-util\") pod 
\"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.506105 master-0 kubenswrapper[16352]: I0307 21:35:06.506019 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fl5d\" (UniqueName: \"kubernetes.io/projected/c6a129f3-7280-4285-b02e-3c16b99e8db1-kube-api-access-5fl5d\") pod \"d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") " pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:06.607830 master-0 kubenswrapper[16352]: I0307 21:35:06.607652 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" Mar 07 21:35:07.171639 master-0 kubenswrapper[16352]: I0307 21:35:07.171559 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb"] Mar 07 21:35:07.233759 master-0 kubenswrapper[16352]: I0307 21:35:07.233629 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" event={"ID":"c6a129f3-7280-4285-b02e-3c16b99e8db1","Type":"ContainerStarted","Data":"4208db5b2d35bc54cda8ebf967f90f8df48fddbb3c4ac0221f576c07d83e1535"} Mar 07 21:35:07.687938 master-0 kubenswrapper[16352]: I0307 21:35:07.680917 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"] Mar 07 21:35:07.687938 master-0 kubenswrapper[16352]: I0307 21:35:07.687848 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" Mar 07 21:35:07.691735 master-0 kubenswrapper[16352]: I0307 21:35:07.691591 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"] Mar 07 21:35:07.809188 master-0 kubenswrapper[16352]: I0307 21:35:07.809075 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wflx4\" (UniqueName: \"kubernetes.io/projected/1d38fbf2-deb6-4766-8b6f-871253b22b82-kube-api-access-wflx4\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" Mar 07 21:35:07.809459 master-0 kubenswrapper[16352]: I0307 21:35:07.809274 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" Mar 07 21:35:07.809459 master-0 kubenswrapper[16352]: I0307 21:35:07.809403 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" Mar 07 21:35:07.910994 master-0 kubenswrapper[16352]: I0307 21:35:07.910897 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" 
(UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:07.912052 master-0 kubenswrapper[16352]: I0307 21:35:07.911981 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wflx4\" (UniqueName: \"kubernetes.io/projected/1d38fbf2-deb6-4766-8b6f-871253b22b82-kube-api-access-wflx4\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:07.912391 master-0 kubenswrapper[16352]: I0307 21:35:07.911993 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-util\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:07.912464 master-0 kubenswrapper[16352]: I0307 21:35:07.912441 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:07.913019 master-0 kubenswrapper[16352]: I0307 21:35:07.912973 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-bundle\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:07.934545 master-0 kubenswrapper[16352]: I0307 21:35:07.934471 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wflx4\" (UniqueName: \"kubernetes.io/projected/1d38fbf2-deb6-4766-8b6f-871253b22b82-kube-api-access-wflx4\") pod \"0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") " pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:08.027149 master-0 kubenswrapper[16352]: I0307 21:35:08.027007 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:08.253294 master-0 kubenswrapper[16352]: I0307 21:35:08.253214 16352 generic.go:334] "Generic (PLEG): container finished" podID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerID="c636e20e1b6ebcbf1a3977073312800f27a6ec375cf3dbebcc9b7d8dc2f06dac" exitCode=0
Mar 07 21:35:08.253942 master-0 kubenswrapper[16352]: I0307 21:35:08.253286 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" event={"ID":"c6a129f3-7280-4285-b02e-3c16b99e8db1","Type":"ContainerDied","Data":"c636e20e1b6ebcbf1a3977073312800f27a6ec375cf3dbebcc9b7d8dc2f06dac"}
Mar 07 21:35:08.496063 master-0 kubenswrapper[16352]: I0307 21:35:08.495993 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"]
Mar 07 21:35:08.498653 master-0 kubenswrapper[16352]: W0307 21:35:08.498004 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d38fbf2_deb6_4766_8b6f_871253b22b82.slice/crio-5c808e8a11f2b6d6c705ec911e6289e6b95282d0fcd4976039d57ad25c06bef7 WatchSource:0}: Error finding container 5c808e8a11f2b6d6c705ec911e6289e6b95282d0fcd4976039d57ad25c06bef7: Status 404 returned error can't find the container with id 5c808e8a11f2b6d6c705ec911e6289e6b95282d0fcd4976039d57ad25c06bef7
Mar 07 21:35:09.267401 master-0 kubenswrapper[16352]: I0307 21:35:09.267326 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" event={"ID":"1d38fbf2-deb6-4766-8b6f-871253b22b82","Type":"ContainerStarted","Data":"1f3a5ddf4bbe6bda8846a7f1ed94e9c48dfc770d1b99793e6e955af6fc2e3bc5"}
Mar 07 21:35:09.268164 master-0 kubenswrapper[16352]: I0307 21:35:09.268146 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" event={"ID":"1d38fbf2-deb6-4766-8b6f-871253b22b82","Type":"ContainerStarted","Data":"5c808e8a11f2b6d6c705ec911e6289e6b95282d0fcd4976039d57ad25c06bef7"}
Mar 07 21:35:10.285781 master-0 kubenswrapper[16352]: I0307 21:35:10.285674 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerID="1f3a5ddf4bbe6bda8846a7f1ed94e9c48dfc770d1b99793e6e955af6fc2e3bc5" exitCode=0
Mar 07 21:35:10.286444 master-0 kubenswrapper[16352]: I0307 21:35:10.285727 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" event={"ID":"1d38fbf2-deb6-4766-8b6f-871253b22b82","Type":"ContainerDied","Data":"1f3a5ddf4bbe6bda8846a7f1ed94e9c48dfc770d1b99793e6e955af6fc2e3bc5"}
Mar 07 21:35:10.290818 master-0 kubenswrapper[16352]: I0307 21:35:10.290766 16352 generic.go:334] "Generic (PLEG): container finished" podID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerID="5fd7aa728dca5c3b21476ca823cdd6fa530fd26c8da612f2b9e812021a430deb" exitCode=0
Mar 07 21:35:10.290929 master-0 kubenswrapper[16352]: I0307 21:35:10.290851 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" event={"ID":"9b4e61c6-94ff-4048-8e7f-9c15844e3f09","Type":"ContainerDied","Data":"5fd7aa728dca5c3b21476ca823cdd6fa530fd26c8da612f2b9e812021a430deb"}
Mar 07 21:35:11.304277 master-0 kubenswrapper[16352]: I0307 21:35:11.304098 16352 generic.go:334] "Generic (PLEG): container finished" podID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerID="025a6c71145bc4e504cb8ae19fa8dc64004831dbed299305de577939328d2fd1" exitCode=0
Mar 07 21:35:11.304277 master-0 kubenswrapper[16352]: I0307 21:35:11.304167 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" event={"ID":"c6a129f3-7280-4285-b02e-3c16b99e8db1","Type":"ContainerDied","Data":"025a6c71145bc4e504cb8ae19fa8dc64004831dbed299305de577939328d2fd1"}
Mar 07 21:35:11.309190 master-0 kubenswrapper[16352]: I0307 21:35:11.309123 16352 generic.go:334] "Generic (PLEG): container finished" podID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerID="e1df9b9199d3bc79ab17051c1c61054aca511046af10bfc78289327d5739dc86" exitCode=0
Mar 07 21:35:11.309319 master-0 kubenswrapper[16352]: I0307 21:35:11.309221 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" event={"ID":"9b4e61c6-94ff-4048-8e7f-9c15844e3f09","Type":"ContainerDied","Data":"e1df9b9199d3bc79ab17051c1c61054aca511046af10bfc78289327d5739dc86"}
Mar 07 21:35:12.276365 master-0 kubenswrapper[16352]: I0307 21:35:12.276110 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"]
Mar 07 21:35:12.278149 master-0 kubenswrapper[16352]: I0307 21:35:12.278116 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.295189 master-0 kubenswrapper[16352]: I0307 21:35:12.295101 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"]
Mar 07 21:35:12.336482 master-0 kubenswrapper[16352]: I0307 21:35:12.336399 16352 generic.go:334] "Generic (PLEG): container finished" podID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerID="1c085af3308de23c7765befb49e6b958a6e4ea8510a6f400bcc9a11e8699b0b4" exitCode=0
Mar 07 21:35:12.338568 master-0 kubenswrapper[16352]: I0307 21:35:12.336461 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" event={"ID":"c6a129f3-7280-4285-b02e-3c16b99e8db1","Type":"ContainerDied","Data":"1c085af3308de23c7765befb49e6b958a6e4ea8510a6f400bcc9a11e8699b0b4"}
Mar 07 21:35:12.346512 master-0 kubenswrapper[16352]: I0307 21:35:12.344858 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerID="10f8f6fc740febc2be5f24132d42fb8a8d81083bfe586453178c35a8b8d4a264" exitCode=0
Mar 07 21:35:12.346512 master-0 kubenswrapper[16352]: I0307 21:35:12.344949 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" event={"ID":"1d38fbf2-deb6-4766-8b6f-871253b22b82","Type":"ContainerDied","Data":"10f8f6fc740febc2be5f24132d42fb8a8d81083bfe586453178c35a8b8d4a264"}
Mar 07 21:35:12.408189 master-0 kubenswrapper[16352]: I0307 21:35:12.408105 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5ppq2\" (UniqueName: \"kubernetes.io/projected/b5aa8fd8-22b6-445d-a222-d089d6c20d64-kube-api-access-5ppq2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.408369 master-0 kubenswrapper[16352]: I0307 21:35:12.408276 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.408458 master-0 kubenswrapper[16352]: I0307 21:35:12.408417 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.510787 master-0 kubenswrapper[16352]: I0307 21:35:12.509654 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5ppq2\" (UniqueName: \"kubernetes.io/projected/b5aa8fd8-22b6-445d-a222-d089d6c20d64-kube-api-access-5ppq2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.510787 master-0 kubenswrapper[16352]: I0307 21:35:12.509785 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.510787 master-0 kubenswrapper[16352]: I0307 21:35:12.510085 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.510787 master-0 kubenswrapper[16352]: I0307 21:35:12.510425 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-util\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.510787 master-0 kubenswrapper[16352]: I0307 21:35:12.510547 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-bundle\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.543303 master-0 kubenswrapper[16352]: I0307 21:35:12.539668 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5ppq2\" (UniqueName: \"kubernetes.io/projected/b5aa8fd8-22b6-445d-a222-d089d6c20d64-kube-api-access-5ppq2\") pod \"98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.670861 master-0 kubenswrapper[16352]: I0307 21:35:12.666773 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"
Mar 07 21:35:12.784143 master-0 kubenswrapper[16352]: I0307 21:35:12.783822 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg"
Mar 07 21:35:12.877840 master-0 kubenswrapper[16352]: I0307 21:35:12.877757 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-6594fcb745-7lf8n"]
Mar 07 21:35:12.878324 master-0 kubenswrapper[16352]: E0307 21:35:12.878301 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerName="extract"
Mar 07 21:35:12.878324 master-0 kubenswrapper[16352]: I0307 21:35:12.878325 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerName="extract"
Mar 07 21:35:12.878410 master-0 kubenswrapper[16352]: E0307 21:35:12.878349 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerName="pull"
Mar 07 21:35:12.878410 master-0 kubenswrapper[16352]: I0307 21:35:12.878357 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerName="pull"
Mar 07 21:35:12.878410 master-0 kubenswrapper[16352]: E0307 21:35:12.878384 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerName="util"
Mar 07 21:35:12.878410 master-0 kubenswrapper[16352]: I0307 21:35:12.878393 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerName="util"
Mar 07 21:35:12.878586 master-0 kubenswrapper[16352]: I0307 21:35:12.878561 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b4e61c6-94ff-4048-8e7f-9c15844e3f09" containerName="extract"
Mar 07 21:35:12.884465 master-0 kubenswrapper[16352]: I0307 21:35:12.884411 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:12.889231 master-0 kubenswrapper[16352]: I0307 21:35:12.888944 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6594fcb745-7lf8n"]
Mar 07 21:35:12.924477 master-0 kubenswrapper[16352]: I0307 21:35:12.923549 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-bundle\") pod \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") "
Mar 07 21:35:12.924477 master-0 kubenswrapper[16352]: I0307 21:35:12.923934 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9mhc\" (UniqueName: \"kubernetes.io/projected/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-kube-api-access-v9mhc\") pod \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") "
Mar 07 21:35:12.924477 master-0 kubenswrapper[16352]: I0307 21:35:12.924017 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-util\") pod \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\" (UID: \"9b4e61c6-94ff-4048-8e7f-9c15844e3f09\") "
Mar 07 21:35:12.925315 master-0 kubenswrapper[16352]: I0307 21:35:12.925243 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-bundle" (OuterVolumeSpecName: "bundle") pod "9b4e61c6-94ff-4048-8e7f-9c15844e3f09" (UID: "9b4e61c6-94ff-4048-8e7f-9c15844e3f09"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:35:12.938544 master-0 kubenswrapper[16352]: I0307 21:35:12.933999 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-kube-api-access-v9mhc" (OuterVolumeSpecName: "kube-api-access-v9mhc") pod "9b4e61c6-94ff-4048-8e7f-9c15844e3f09" (UID: "9b4e61c6-94ff-4048-8e7f-9c15844e3f09"). InnerVolumeSpecName "kube-api-access-v9mhc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:35:12.938544 master-0 kubenswrapper[16352]: I0307 21:35:12.938440 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-util" (OuterVolumeSpecName: "util") pod "9b4e61c6-94ff-4048-8e7f-9c15844e3f09" (UID: "9b4e61c6-94ff-4048-8e7f-9c15844e3f09"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:35:13.026510 master-0 kubenswrapper[16352]: I0307 21:35:13.026370 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-oauth-serving-cert\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.026510 master-0 kubenswrapper[16352]: I0307 21:35:13.026454 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-serving-cert\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.026510 master-0 kubenswrapper[16352]: I0307 21:35:13.026487 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-oauth-config\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.026962 master-0 kubenswrapper[16352]: I0307 21:35:13.026855 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mdh29\" (UniqueName: \"kubernetes.io/projected/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-kube-api-access-mdh29\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.027295 master-0 kubenswrapper[16352]: I0307 21:35:13.027216 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-service-ca\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.027424 master-0 kubenswrapper[16352]: I0307 21:35:13.027382 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-trusted-ca-bundle\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.027632 master-0 kubenswrapper[16352]: I0307 21:35:13.027590 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-config\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.027889 master-0 kubenswrapper[16352]: I0307 21:35:13.027846 16352 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:35:13.027889 master-0 kubenswrapper[16352]: I0307 21:35:13.027881 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9mhc\" (UniqueName: \"kubernetes.io/projected/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-kube-api-access-v9mhc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:35:13.028003 master-0 kubenswrapper[16352]: I0307 21:35:13.027903 16352 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/9b4e61c6-94ff-4048-8e7f-9c15844e3f09-util\") on node \"master-0\" DevicePath \"\""
Mar 07 21:35:13.130034 master-0 kubenswrapper[16352]: I0307 21:35:13.129904 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mdh29\" (UniqueName: \"kubernetes.io/projected/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-kube-api-access-mdh29\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.130279 master-0 kubenswrapper[16352]: I0307 21:35:13.130218 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-service-ca\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.130333 master-0 kubenswrapper[16352]: I0307 21:35:13.130285 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-trusted-ca-bundle\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.130378 master-0 kubenswrapper[16352]: I0307 21:35:13.130354 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-config\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.130431 master-0 kubenswrapper[16352]: I0307 21:35:13.130396 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-oauth-serving-cert\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.130476 master-0 kubenswrapper[16352]: I0307 21:35:13.130443 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-serving-cert\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.130520 master-0 kubenswrapper[16352]: I0307 21:35:13.130472 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-oauth-config\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.132857 master-0 kubenswrapper[16352]: I0307 21:35:13.132794 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-config\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.133894 master-0 kubenswrapper[16352]: I0307 21:35:13.132953 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-oauth-serving-cert\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.133894 master-0 kubenswrapper[16352]: I0307 21:35:13.133450 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-trusted-ca-bundle\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.133894 master-0 kubenswrapper[16352]: I0307 21:35:13.133489 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-service-ca\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.136041 master-0 kubenswrapper[16352]: I0307 21:35:13.135985 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-oauth-config\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.136181 master-0 kubenswrapper[16352]: I0307 21:35:13.136034 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-serving-cert\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.149404 master-0 kubenswrapper[16352]: I0307 21:35:13.149349 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mdh29\" (UniqueName: \"kubernetes.io/projected/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-kube-api-access-mdh29\") pod \"console-6594fcb745-7lf8n\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.248793 master-0 kubenswrapper[16352]: I0307 21:35:13.248720 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w"]
Mar 07 21:35:13.254389 master-0 kubenswrapper[16352]: W0307 21:35:13.254323 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5aa8fd8_22b6_445d_a222_d089d6c20d64.slice/crio-1551e6c66ebd5819d8551f368ab8a834b7058d6bf542b30e4cb808fb3082700c WatchSource:0}: Error finding container 1551e6c66ebd5819d8551f368ab8a834b7058d6bf542b30e4cb808fb3082700c: Status 404 returned error can't find the container with id 1551e6c66ebd5819d8551f368ab8a834b7058d6bf542b30e4cb808fb3082700c
Mar 07 21:35:13.273372 master-0 kubenswrapper[16352]: I0307 21:35:13.273324 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:13.363148 master-0 kubenswrapper[16352]: I0307 21:35:13.363089 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w" event={"ID":"b5aa8fd8-22b6-445d-a222-d089d6c20d64","Type":"ContainerStarted","Data":"1551e6c66ebd5819d8551f368ab8a834b7058d6bf542b30e4cb808fb3082700c"}
Mar 07 21:35:13.380157 master-0 kubenswrapper[16352]: I0307 21:35:13.370436 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg" event={"ID":"9b4e61c6-94ff-4048-8e7f-9c15844e3f09","Type":"ContainerDied","Data":"ab27865261c3f85d42d57ba4643df1750e10e1fcea8b574168f22a05f58e06da"}
Mar 07 21:35:13.380157 master-0 kubenswrapper[16352]: I0307 21:35:13.370475 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ab27865261c3f85d42d57ba4643df1750e10e1fcea8b574168f22a05f58e06da"
Mar 07 21:35:13.380157 master-0 kubenswrapper[16352]: I0307 21:35:13.370619 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/925ad1f05bf386dc21bdfe2f8249c1fbfd04a404dec7a7fb6362d758e5c9dmg"
Mar 07 21:35:13.380157 master-0 kubenswrapper[16352]: I0307 21:35:13.377442 16352 generic.go:334] "Generic (PLEG): container finished" podID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerID="1df269899649af33182fa20312f26edd1e0f203ec4666e83dbb05a4135cc3ca0" exitCode=0
Mar 07 21:35:13.380157 master-0 kubenswrapper[16352]: I0307 21:35:13.378025 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" event={"ID":"1d38fbf2-deb6-4766-8b6f-871253b22b82","Type":"ContainerDied","Data":"1df269899649af33182fa20312f26edd1e0f203ec4666e83dbb05a4135cc3ca0"}
Mar 07 21:35:13.799370 master-0 kubenswrapper[16352]: I0307 21:35:13.799326 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb"
Mar 07 21:35:13.868348 master-0 kubenswrapper[16352]: I0307 21:35:13.868269 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6594fcb745-7lf8n"]
Mar 07 21:35:13.953974 master-0 kubenswrapper[16352]: I0307 21:35:13.953875 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-bundle\") pod \"c6a129f3-7280-4285-b02e-3c16b99e8db1\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") "
Mar 07 21:35:13.954078 master-0 kubenswrapper[16352]: I0307 21:35:13.954050 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-util\") pod \"c6a129f3-7280-4285-b02e-3c16b99e8db1\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") "
Mar 07 21:35:13.954150 master-0 kubenswrapper[16352]: I0307 21:35:13.954113 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5fl5d\" (UniqueName: \"kubernetes.io/projected/c6a129f3-7280-4285-b02e-3c16b99e8db1-kube-api-access-5fl5d\") pod \"c6a129f3-7280-4285-b02e-3c16b99e8db1\" (UID: \"c6a129f3-7280-4285-b02e-3c16b99e8db1\") "
Mar 07 21:35:13.955635 master-0 kubenswrapper[16352]: I0307 21:35:13.955586 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-bundle" (OuterVolumeSpecName: "bundle") pod "c6a129f3-7280-4285-b02e-3c16b99e8db1" (UID: "c6a129f3-7280-4285-b02e-3c16b99e8db1"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:35:13.957811 master-0 kubenswrapper[16352]: I0307 21:35:13.957755 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6a129f3-7280-4285-b02e-3c16b99e8db1-kube-api-access-5fl5d" (OuterVolumeSpecName: "kube-api-access-5fl5d") pod "c6a129f3-7280-4285-b02e-3c16b99e8db1" (UID: "c6a129f3-7280-4285-b02e-3c16b99e8db1"). InnerVolumeSpecName "kube-api-access-5fl5d". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:35:13.970280 master-0 kubenswrapper[16352]: I0307 21:35:13.964150 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-util" (OuterVolumeSpecName: "util") pod "c6a129f3-7280-4285-b02e-3c16b99e8db1" (UID: "c6a129f3-7280-4285-b02e-3c16b99e8db1"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:35:14.057062 master-0 kubenswrapper[16352]: I0307 21:35:14.056936 16352 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:35:14.057062 master-0 kubenswrapper[16352]: I0307 21:35:14.057006 16352 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c6a129f3-7280-4285-b02e-3c16b99e8db1-util\") on node \"master-0\" DevicePath \"\""
Mar 07 21:35:14.057062 master-0 kubenswrapper[16352]: I0307 21:35:14.057025 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5fl5d\" (UniqueName: \"kubernetes.io/projected/c6a129f3-7280-4285-b02e-3c16b99e8db1-kube-api-access-5fl5d\") on node \"master-0\" DevicePath \"\""
Mar 07 21:35:14.390209 master-0 kubenswrapper[16352]: I0307 21:35:14.390162 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb"
Mar 07 21:35:14.391016 master-0 kubenswrapper[16352]: I0307 21:35:14.390162 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/d146760600e43041070ad4572d9c23f31a62e3aefc01a54998863bc5f4s97zb" event={"ID":"c6a129f3-7280-4285-b02e-3c16b99e8db1","Type":"ContainerDied","Data":"4208db5b2d35bc54cda8ebf967f90f8df48fddbb3c4ac0221f576c07d83e1535"}
Mar 07 21:35:14.391070 master-0 kubenswrapper[16352]: I0307 21:35:14.391048 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4208db5b2d35bc54cda8ebf967f90f8df48fddbb3c4ac0221f576c07d83e1535"
Mar 07 21:35:14.392891 master-0 kubenswrapper[16352]: I0307 21:35:14.392825 16352 generic.go:334] "Generic (PLEG): container finished" podID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerID="438f7039465b447e826a87338bfc45dd7b8cf705d94cd3faca9e30c300993b36" exitCode=0
Mar 07 21:35:14.392945 master-0 kubenswrapper[16352]: I0307 21:35:14.392896 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w" event={"ID":"b5aa8fd8-22b6-445d-a222-d089d6c20d64","Type":"ContainerDied","Data":"438f7039465b447e826a87338bfc45dd7b8cf705d94cd3faca9e30c300993b36"}
Mar 07 21:35:14.398074 master-0 kubenswrapper[16352]: I0307 21:35:14.398002 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6594fcb745-7lf8n" event={"ID":"f2cb093b-e5fc-4408-8fdf-8b72dfc80385","Type":"ContainerStarted","Data":"246b22ec02344c71757e8b8be4f003ccb468a1824adbbd2e754cfa9692d708d1"}
Mar 07 21:35:14.398133 master-0 kubenswrapper[16352]: I0307 21:35:14.398090 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6594fcb745-7lf8n" event={"ID":"f2cb093b-e5fc-4408-8fdf-8b72dfc80385","Type":"ContainerStarted","Data":"876fc01ff43d1820c5b0772826f314253e844a17904f05511b33f8c9626b2f29"}
Mar 07 21:35:14.466136 master-0 kubenswrapper[16352]: I0307 21:35:14.465970 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6594fcb745-7lf8n" podStartSLOduration=2.465927865 podStartE2EDuration="2.465927865s" podCreationTimestamp="2026-03-07 21:35:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:35:14.453339676 +0000 UTC m=+1037.524044805" watchObservedRunningTime="2026-03-07 21:35:14.465927865 +0000 UTC m=+1037.536632994"
Mar 07 21:35:14.752063 master-0 kubenswrapper[16352]: I0307 21:35:14.752004 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns"
Mar 07 21:35:14.881266 master-0 kubenswrapper[16352]: I0307 21:35:14.881173 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wflx4\" (UniqueName: \"kubernetes.io/projected/1d38fbf2-deb6-4766-8b6f-871253b22b82-kube-api-access-wflx4\") pod \"1d38fbf2-deb6-4766-8b6f-871253b22b82\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") "
Mar 07 21:35:14.881596 master-0 kubenswrapper[16352]: I0307 21:35:14.881475 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-util\") pod \"1d38fbf2-deb6-4766-8b6f-871253b22b82\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") "
Mar 07 21:35:14.881596 master-0 kubenswrapper[16352]: I0307 21:35:14.881561 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-bundle\") pod \"1d38fbf2-deb6-4766-8b6f-871253b22b82\" (UID: \"1d38fbf2-deb6-4766-8b6f-871253b22b82\") "
Mar 07 21:35:14.882671 master-0 kubenswrapper[16352]: I0307 21:35:14.882624 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-bundle" (OuterVolumeSpecName: "bundle") pod "1d38fbf2-deb6-4766-8b6f-871253b22b82" (UID: "1d38fbf2-deb6-4766-8b6f-871253b22b82"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:35:14.882959 master-0 kubenswrapper[16352]: I0307 21:35:14.882918 16352 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:35:14.887320 master-0 kubenswrapper[16352]: I0307 21:35:14.887236 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d38fbf2-deb6-4766-8b6f-871253b22b82-kube-api-access-wflx4" (OuterVolumeSpecName: "kube-api-access-wflx4") pod "1d38fbf2-deb6-4766-8b6f-871253b22b82" (UID: "1d38fbf2-deb6-4766-8b6f-871253b22b82"). InnerVolumeSpecName "kube-api-access-wflx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:35:14.899467 master-0 kubenswrapper[16352]: I0307 21:35:14.898473 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-util" (OuterVolumeSpecName: "util") pod "1d38fbf2-deb6-4766-8b6f-871253b22b82" (UID: "1d38fbf2-deb6-4766-8b6f-871253b22b82"). InnerVolumeSpecName "util".
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:35:14.984895 master-0 kubenswrapper[16352]: I0307 21:35:14.984727 16352 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d38fbf2-deb6-4766-8b6f-871253b22b82-util\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:14.984895 master-0 kubenswrapper[16352]: I0307 21:35:14.984780 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wflx4\" (UniqueName: \"kubernetes.io/projected/1d38fbf2-deb6-4766-8b6f-871253b22b82-kube-api-access-wflx4\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:15.414082 master-0 kubenswrapper[16352]: I0307 21:35:15.414011 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" Mar 07 21:35:15.414950 master-0 kubenswrapper[16352]: I0307 21:35:15.414044 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/0e94e7566f739476ccec6d16e58de3f1c434cfa3060893f90f3e473a82tb4ns" event={"ID":"1d38fbf2-deb6-4766-8b6f-871253b22b82","Type":"ContainerDied","Data":"5c808e8a11f2b6d6c705ec911e6289e6b95282d0fcd4976039d57ad25c06bef7"} Mar 07 21:35:15.414950 master-0 kubenswrapper[16352]: I0307 21:35:15.414189 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5c808e8a11f2b6d6c705ec911e6289e6b95282d0fcd4976039d57ad25c06bef7" Mar 07 21:35:16.056028 master-0 kubenswrapper[16352]: E0307 21:35:16.055824 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5aa8fd8_22b6_445d_a222_d089d6c20d64.slice/crio-405af1e7e9800234b3b137ed4a1b433e4292c80914fc7a1bc75c020acf50d095.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:35:16.451707 master-0 kubenswrapper[16352]: I0307 21:35:16.445649 16352 
generic.go:334] "Generic (PLEG): container finished" podID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerID="405af1e7e9800234b3b137ed4a1b433e4292c80914fc7a1bc75c020acf50d095" exitCode=0 Mar 07 21:35:16.451707 master-0 kubenswrapper[16352]: I0307 21:35:16.445730 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w" event={"ID":"b5aa8fd8-22b6-445d-a222-d089d6c20d64","Type":"ContainerDied","Data":"405af1e7e9800234b3b137ed4a1b433e4292c80914fc7a1bc75c020acf50d095"} Mar 07 21:35:17.458379 master-0 kubenswrapper[16352]: I0307 21:35:17.458290 16352 generic.go:334] "Generic (PLEG): container finished" podID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerID="d6f07f39f03b210b6979768e2fa7faf3c2693fa0854d1198292b2942e3fe84c8" exitCode=0 Mar 07 21:35:17.458379 master-0 kubenswrapper[16352]: I0307 21:35:17.458375 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w" event={"ID":"b5aa8fd8-22b6-445d-a222-d089d6c20d64","Type":"ContainerDied","Data":"d6f07f39f03b210b6979768e2fa7faf3c2693fa0854d1198292b2942e3fe84c8"} Mar 07 21:35:18.848515 master-0 kubenswrapper[16352]: I0307 21:35:18.848415 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w" Mar 07 21:35:18.974320 master-0 kubenswrapper[16352]: I0307 21:35:18.974136 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5ppq2\" (UniqueName: \"kubernetes.io/projected/b5aa8fd8-22b6-445d-a222-d089d6c20d64-kube-api-access-5ppq2\") pod \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " Mar 07 21:35:18.974320 master-0 kubenswrapper[16352]: I0307 21:35:18.974263 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-util\") pod \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " Mar 07 21:35:18.974320 master-0 kubenswrapper[16352]: I0307 21:35:18.974359 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-bundle\") pod \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\" (UID: \"b5aa8fd8-22b6-445d-a222-d089d6c20d64\") " Mar 07 21:35:18.978320 master-0 kubenswrapper[16352]: I0307 21:35:18.978247 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-bundle" (OuterVolumeSpecName: "bundle") pod "b5aa8fd8-22b6-445d-a222-d089d6c20d64" (UID: "b5aa8fd8-22b6-445d-a222-d089d6c20d64"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:35:18.979192 master-0 kubenswrapper[16352]: I0307 21:35:18.979137 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5aa8fd8-22b6-445d-a222-d089d6c20d64-kube-api-access-5ppq2" (OuterVolumeSpecName: "kube-api-access-5ppq2") pod "b5aa8fd8-22b6-445d-a222-d089d6c20d64" (UID: "b5aa8fd8-22b6-445d-a222-d089d6c20d64"). InnerVolumeSpecName "kube-api-access-5ppq2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:35:18.992892 master-0 kubenswrapper[16352]: I0307 21:35:18.992797 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-util" (OuterVolumeSpecName: "util") pod "b5aa8fd8-22b6-445d-a222-d089d6c20d64" (UID: "b5aa8fd8-22b6-445d-a222-d089d6c20d64"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:35:19.077500 master-0 kubenswrapper[16352]: I0307 21:35:19.077317 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5ppq2\" (UniqueName: \"kubernetes.io/projected/b5aa8fd8-22b6-445d-a222-d089d6c20d64-kube-api-access-5ppq2\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:19.077500 master-0 kubenswrapper[16352]: I0307 21:35:19.077386 16352 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-util\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:19.077500 master-0 kubenswrapper[16352]: I0307 21:35:19.077401 16352 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b5aa8fd8-22b6-445d-a222-d089d6c20d64-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:19.478836 master-0 kubenswrapper[16352]: I0307 21:35:19.477377 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w" event={"ID":"b5aa8fd8-22b6-445d-a222-d089d6c20d64","Type":"ContainerDied","Data":"1551e6c66ebd5819d8551f368ab8a834b7058d6bf542b30e4cb808fb3082700c"} Mar 07 21:35:19.478836 master-0 kubenswrapper[16352]: I0307 21:35:19.477443 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1551e6c66ebd5819d8551f368ab8a834b7058d6bf542b30e4cb808fb3082700c" Mar 07 21:35:19.478836 master-0 kubenswrapper[16352]: I0307 21:35:19.477504 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/98629960b44b381d1a86cff1d1439a8df43509c9ad24579158c59d0f0822r9w" Mar 07 21:35:19.957549 master-0 kubenswrapper[16352]: I0307 21:35:19.957437 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"] Mar 07 21:35:19.958808 master-0 kubenswrapper[16352]: E0307 21:35:19.958754 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerName="util" Mar 07 21:35:19.958979 master-0 kubenswrapper[16352]: I0307 21:35:19.958878 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerName="util" Mar 07 21:35:19.959052 master-0 kubenswrapper[16352]: E0307 21:35:19.958987 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerName="util" Mar 07 21:35:19.959119 master-0 kubenswrapper[16352]: I0307 21:35:19.959056 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerName="util" Mar 07 21:35:19.959184 master-0 kubenswrapper[16352]: E0307 21:35:19.959087 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerName="extract" Mar 07 21:35:19.959287 master-0 
kubenswrapper[16352]: I0307 21:35:19.959245 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerName="extract" Mar 07 21:35:19.959435 master-0 kubenswrapper[16352]: E0307 21:35:19.959375 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerName="pull" Mar 07 21:35:19.959523 master-0 kubenswrapper[16352]: I0307 21:35:19.959451 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerName="pull" Mar 07 21:35:19.959591 master-0 kubenswrapper[16352]: E0307 21:35:19.959489 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerName="extract" Mar 07 21:35:19.959591 master-0 kubenswrapper[16352]: I0307 21:35:19.959552 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerName="extract" Mar 07 21:35:19.959747 master-0 kubenswrapper[16352]: E0307 21:35:19.959585 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerName="extract" Mar 07 21:35:19.959747 master-0 kubenswrapper[16352]: I0307 21:35:19.959648 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerName="extract" Mar 07 21:35:19.959902 master-0 kubenswrapper[16352]: E0307 21:35:19.959762 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerName="pull" Mar 07 21:35:19.959902 master-0 kubenswrapper[16352]: I0307 21:35:19.959788 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerName="pull" Mar 07 21:35:19.960026 master-0 kubenswrapper[16352]: E0307 21:35:19.959876 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerName="util" 
Mar 07 21:35:19.960026 master-0 kubenswrapper[16352]: I0307 21:35:19.959952 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerName="util"
Mar 07 21:35:19.960147 master-0 kubenswrapper[16352]: E0307 21:35:19.959989 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerName="pull"
Mar 07 21:35:19.960147 master-0 kubenswrapper[16352]: I0307 21:35:19.960059 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerName="pull"
Mar 07 21:35:19.960875 master-0 kubenswrapper[16352]: I0307 21:35:19.960763 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d38fbf2-deb6-4766-8b6f-871253b22b82" containerName="extract"
Mar 07 21:35:19.960994 master-0 kubenswrapper[16352]: I0307 21:35:19.960873 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5aa8fd8-22b6-445d-a222-d089d6c20d64" containerName="extract"
Mar 07 21:35:19.960994 master-0 kubenswrapper[16352]: I0307 21:35:19.960955 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6a129f3-7280-4285-b02e-3c16b99e8db1" containerName="extract"
Mar 07 21:35:19.963288 master-0 kubenswrapper[16352]: I0307 21:35:19.963230 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:19.966174 master-0 kubenswrapper[16352]: I0307 21:35:19.966092 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"openshift-service-ca.crt"
Mar 07 21:35:19.967045 master-0 kubenswrapper[16352]: I0307 21:35:19.966981 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager-operator"/"kube-root-ca.crt"
Mar 07 21:35:20.059333 master-0 kubenswrapper[16352]: I0307 21:35:20.059242 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"]
Mar 07 21:35:20.100442 master-0 kubenswrapper[16352]: I0307 21:35:20.096782 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbgpw\" (UniqueName: \"kubernetes.io/projected/22634e6e-01eb-4c26-a75a-2b2699562790-kube-api-access-dbgpw\") pod \"cert-manager-operator-controller-manager-66c8bdd694-76d6g\" (UID: \"22634e6e-01eb-4c26-a75a-2b2699562790\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:20.100442 master-0 kubenswrapper[16352]: I0307 21:35:20.096995 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22634e6e-01eb-4c26-a75a-2b2699562790-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-76d6g\" (UID: \"22634e6e-01eb-4c26-a75a-2b2699562790\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:20.198932 master-0 kubenswrapper[16352]: I0307 21:35:20.198866 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22634e6e-01eb-4c26-a75a-2b2699562790-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-76d6g\" (UID: \"22634e6e-01eb-4c26-a75a-2b2699562790\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:20.199207 master-0 kubenswrapper[16352]: I0307 21:35:20.198977 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbgpw\" (UniqueName: \"kubernetes.io/projected/22634e6e-01eb-4c26-a75a-2b2699562790-kube-api-access-dbgpw\") pod \"cert-manager-operator-controller-manager-66c8bdd694-76d6g\" (UID: \"22634e6e-01eb-4c26-a75a-2b2699562790\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:20.200115 master-0 kubenswrapper[16352]: I0307 21:35:20.200042 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/22634e6e-01eb-4c26-a75a-2b2699562790-tmp\") pod \"cert-manager-operator-controller-manager-66c8bdd694-76d6g\" (UID: \"22634e6e-01eb-4c26-a75a-2b2699562790\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:20.238941 master-0 kubenswrapper[16352]: I0307 21:35:20.238785 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbgpw\" (UniqueName: \"kubernetes.io/projected/22634e6e-01eb-4c26-a75a-2b2699562790-kube-api-access-dbgpw\") pod \"cert-manager-operator-controller-manager-66c8bdd694-76d6g\" (UID: \"22634e6e-01eb-4c26-a75a-2b2699562790\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:20.290139 master-0 kubenswrapper[16352]: I0307 21:35:20.290076 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"
Mar 07 21:35:20.770116 master-0 kubenswrapper[16352]: I0307 21:35:20.770056 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g"]
Mar 07 21:35:20.771973 master-0 kubenswrapper[16352]: W0307 21:35:20.771921 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22634e6e_01eb_4c26_a75a_2b2699562790.slice/crio-f098fed75430c43961fb84f404066bcf0c517fe1b2962b6dba41e0efb65a441e WatchSource:0}: Error finding container f098fed75430c43961fb84f404066bcf0c517fe1b2962b6dba41e0efb65a441e: Status 404 returned error can't find the container with id f098fed75430c43961fb84f404066bcf0c517fe1b2962b6dba41e0efb65a441e
Mar 07 21:35:21.519572 master-0 kubenswrapper[16352]: I0307 21:35:21.519486 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g" event={"ID":"22634e6e-01eb-4c26-a75a-2b2699562790","Type":"ContainerStarted","Data":"f098fed75430c43961fb84f404066bcf0c517fe1b2962b6dba41e0efb65a441e"}
Mar 07 21:35:23.291912 master-0 kubenswrapper[16352]: I0307 21:35:23.277303 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:23.291912 master-0 kubenswrapper[16352]: I0307 21:35:23.277378 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:23.291912 master-0 kubenswrapper[16352]: I0307 21:35:23.285757 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:23.548497 master-0 kubenswrapper[16352]: I0307 21:35:23.548355 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6594fcb745-7lf8n"
Mar 07 21:35:23.625991 master-0 kubenswrapper[16352]: I0307 21:35:23.625917 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f9c4688bb-5k492"]
Mar 07 21:35:24.556460 master-0 kubenswrapper[16352]: I0307 21:35:24.556346 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g" event={"ID":"22634e6e-01eb-4c26-a75a-2b2699562790","Type":"ContainerStarted","Data":"8d04d816b7b5813e4dc4eb412691077ffac4287ff0677e9c759b8cd4fd2bab9d"}
Mar 07 21:35:30.429826 master-0 kubenswrapper[16352]: I0307 21:35:30.429659 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-66c8bdd694-76d6g" podStartSLOduration=7.8329786729999995 podStartE2EDuration="11.429614809s" podCreationTimestamp="2026-03-07 21:35:19 +0000 UTC" firstStartedPulling="2026-03-07 21:35:20.775332544 +0000 UTC m=+1043.846037603" lastFinishedPulling="2026-03-07 21:35:24.37196868 +0000 UTC m=+1047.442673739" observedRunningTime="2026-03-07 21:35:24.636448222 +0000 UTC m=+1047.707153291" watchObservedRunningTime="2026-03-07 21:35:30.429614809 +0000 UTC m=+1053.500319908"
Mar 07 21:35:30.443411 master-0 kubenswrapper[16352]: I0307 21:35:30.443304 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vmqtf"]
Mar 07 21:35:30.444805 master-0 kubenswrapper[16352]: I0307 21:35:30.444773 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:30.450620 master-0 kubenswrapper[16352]: I0307 21:35:30.447994 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt"
Mar 07 21:35:30.467404 master-0 kubenswrapper[16352]: I0307 21:35:30.467301 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt"
Mar 07 21:35:30.467404 master-0 kubenswrapper[16352]: I0307 21:35:30.468991 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gq27h\" (UniqueName: \"kubernetes.io/projected/126fe395-9c81-464a-b5f7-a68c43e4fa53-kube-api-access-gq27h\") pod \"cert-manager-webhook-6888856db4-vmqtf\" (UID: \"126fe395-9c81-464a-b5f7-a68c43e4fa53\") " pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:30.467404 master-0 kubenswrapper[16352]: I0307 21:35:30.469140 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/126fe395-9c81-464a-b5f7-a68c43e4fa53-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vmqtf\" (UID: \"126fe395-9c81-464a-b5f7-a68c43e4fa53\") " pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:30.472565 master-0 kubenswrapper[16352]: I0307 21:35:30.472509 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vmqtf"]
Mar 07 21:35:30.572199 master-0 kubenswrapper[16352]: I0307 21:35:30.572132 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/126fe395-9c81-464a-b5f7-a68c43e4fa53-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vmqtf\" (UID: \"126fe395-9c81-464a-b5f7-a68c43e4fa53\") " pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:30.572542 master-0 kubenswrapper[16352]: I0307 21:35:30.572227 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gq27h\" (UniqueName: \"kubernetes.io/projected/126fe395-9c81-464a-b5f7-a68c43e4fa53-kube-api-access-gq27h\") pod \"cert-manager-webhook-6888856db4-vmqtf\" (UID: \"126fe395-9c81-464a-b5f7-a68c43e4fa53\") " pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:30.602233 master-0 kubenswrapper[16352]: I0307 21:35:30.601273 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/126fe395-9c81-464a-b5f7-a68c43e4fa53-bound-sa-token\") pod \"cert-manager-webhook-6888856db4-vmqtf\" (UID: \"126fe395-9c81-464a-b5f7-a68c43e4fa53\") " pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:30.611866 master-0 kubenswrapper[16352]: I0307 21:35:30.603497 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gq27h\" (UniqueName: \"kubernetes.io/projected/126fe395-9c81-464a-b5f7-a68c43e4fa53-kube-api-access-gq27h\") pod \"cert-manager-webhook-6888856db4-vmqtf\" (UID: \"126fe395-9c81-464a-b5f7-a68c43e4fa53\") " pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:30.784398 master-0 kubenswrapper[16352]: I0307 21:35:30.784207 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf"
Mar 07 21:35:31.285529 master-0 kubenswrapper[16352]: I0307 21:35:31.285431 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-6888856db4-vmqtf"]
Mar 07 21:35:31.286250 master-0 kubenswrapper[16352]: W0307 21:35:31.286182 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod126fe395_9c81_464a_b5f7_a68c43e4fa53.slice/crio-734f302290ad9de8644b03e2b0602434c8f23448d6ba2572cba38a56daf5502d WatchSource:0}: Error finding container 734f302290ad9de8644b03e2b0602434c8f23448d6ba2572cba38a56daf5502d: Status 404 returned error can't find the container with id 734f302290ad9de8644b03e2b0602434c8f23448d6ba2572cba38a56daf5502d
Mar 07 21:35:31.622389 master-0 kubenswrapper[16352]: I0307 21:35:31.622288 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf" event={"ID":"126fe395-9c81-464a-b5f7-a68c43e4fa53","Type":"ContainerStarted","Data":"734f302290ad9de8644b03e2b0602434c8f23448d6ba2572cba38a56daf5502d"}
Mar 07 21:35:31.632861 master-0 kubenswrapper[16352]: I0307 21:35:31.632771 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-p74j2"]
Mar 07 21:35:31.634600 master-0 kubenswrapper[16352]: I0307 21:35:31.634527 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:31.664042 master-0 kubenswrapper[16352]: I0307 21:35:31.663950 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-p74j2"]
Mar 07 21:35:31.696189 master-0 kubenswrapper[16352]: I0307 21:35:31.696107 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t4wj\" (UniqueName: \"kubernetes.io/projected/90cf4c11-ae4e-43d3-ab43-2abf004e8ff6-kube-api-access-6t4wj\") pod \"cert-manager-cainjector-5545bd876-p74j2\" (UID: \"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:31.696189 master-0 kubenswrapper[16352]: I0307 21:35:31.696179 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90cf4c11-ae4e-43d3-ab43-2abf004e8ff6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-p74j2\" (UID: \"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:31.797493 master-0 kubenswrapper[16352]: I0307 21:35:31.797373 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6t4wj\" (UniqueName: \"kubernetes.io/projected/90cf4c11-ae4e-43d3-ab43-2abf004e8ff6-kube-api-access-6t4wj\") pod \"cert-manager-cainjector-5545bd876-p74j2\" (UID: \"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:31.797493 master-0 kubenswrapper[16352]: I0307 21:35:31.797429 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90cf4c11-ae4e-43d3-ab43-2abf004e8ff6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-p74j2\" (UID: \"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:31.814857 master-0 kubenswrapper[16352]: I0307 21:35:31.814783 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/90cf4c11-ae4e-43d3-ab43-2abf004e8ff6-bound-sa-token\") pod \"cert-manager-cainjector-5545bd876-p74j2\" (UID: \"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:31.818325 master-0 kubenswrapper[16352]: I0307 21:35:31.818238 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6t4wj\" (UniqueName: \"kubernetes.io/projected/90cf4c11-ae4e-43d3-ab43-2abf004e8ff6-kube-api-access-6t4wj\") pod \"cert-manager-cainjector-5545bd876-p74j2\" (UID: \"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6\") " pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:31.958252 master-0 kubenswrapper[16352]: I0307 21:35:31.958074 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2"
Mar 07 21:35:32.475646 master-0 kubenswrapper[16352]: I0307 21:35:32.474925 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-5545bd876-p74j2"]
Mar 07 21:35:32.657989 master-0 kubenswrapper[16352]: I0307 21:35:32.657937 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2" event={"ID":"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6","Type":"ContainerStarted","Data":"05463a339499df3c0c4727759837dfd46213b8e4d10266433f25e175c27695d9"}
Mar 07 21:35:38.859762 master-0 kubenswrapper[16352]: I0307 21:35:38.859634 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"]
Mar 07 21:35:38.864231 master-0 kubenswrapper[16352]: I0307 21:35:38.863808 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:38.868844 master-0 kubenswrapper[16352]: I0307 21:35:38.868778 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert"
Mar 07 21:35:38.869053 master-0 kubenswrapper[16352]: I0307 21:35:38.868988 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert"
Mar 07 21:35:38.869177 master-0 kubenswrapper[16352]: I0307 21:35:38.869133 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt"
Mar 07 21:35:38.869408 master-0 kubenswrapper[16352]: I0307 21:35:38.869383 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt"
Mar 07 21:35:38.882979 master-0 kubenswrapper[16352]: I0307 21:35:38.882902 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"]
Mar 07 21:35:39.039810 master-0 kubenswrapper[16352]: I0307 21:35:39.039722 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd3c71c4-822b-4178-ac4e-e89659a2298b-apiservice-cert\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.039810 master-0 kubenswrapper[16352]: I0307 21:35:39.039814 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd3c71c4-822b-4178-ac4e-e89659a2298b-webhook-cert\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.040129 master-0 kubenswrapper[16352]: I0307 21:35:39.039892 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms9sk\" (UniqueName: \"kubernetes.io/projected/dd3c71c4-822b-4178-ac4e-e89659a2298b-kube-api-access-ms9sk\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.144894 master-0 kubenswrapper[16352]: I0307 21:35:39.141453 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms9sk\" (UniqueName: \"kubernetes.io/projected/dd3c71c4-822b-4178-ac4e-e89659a2298b-kube-api-access-ms9sk\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.144894 master-0 kubenswrapper[16352]: I0307 21:35:39.141916 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd3c71c4-822b-4178-ac4e-e89659a2298b-apiservice-cert\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.144894 master-0 kubenswrapper[16352]: I0307 21:35:39.141960 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd3c71c4-822b-4178-ac4e-e89659a2298b-webhook-cert\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.156222 master-0 kubenswrapper[16352]: I0307 21:35:39.153954 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dd3c71c4-822b-4178-ac4e-e89659a2298b-apiservice-cert\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.156222 master-0 kubenswrapper[16352]: I0307 21:35:39.154481 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dd3c71c4-822b-4178-ac4e-e89659a2298b-webhook-cert\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.162650 master-0 kubenswrapper[16352]: I0307 21:35:39.162582 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms9sk\" (UniqueName: \"kubernetes.io/projected/dd3c71c4-822b-4178-ac4e-e89659a2298b-kube-api-access-ms9sk\") pod \"metallb-operator-controller-manager-547df9ff8b-bpxrb\" (UID: \"dd3c71c4-822b-4178-ac4e-e89659a2298b\") " pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"
Mar 07 21:35:39.178475 master-0 kubenswrapper[16352]: I0307 21:35:39.178394 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh"]
Mar 07 21:35:39.179693 master-0 kubenswrapper[16352]: I0307 21:35:39.179646 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.184663 master-0 kubenswrapper[16352]: I0307 21:35:39.184577 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 07 21:35:39.184878 master-0 kubenswrapper[16352]: I0307 21:35:39.184785 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 07 21:35:39.188697 master-0 kubenswrapper[16352]: I0307 21:35:39.188634 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb" Mar 07 21:35:39.224166 master-0 kubenswrapper[16352]: I0307 21:35:39.224093 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh"] Mar 07 21:35:39.358498 master-0 kubenswrapper[16352]: I0307 21:35:39.349733 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abc5cb43-bca2-4115-a1a6-048526788102-apiservice-cert\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.358498 master-0 kubenswrapper[16352]: I0307 21:35:39.349890 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abc5cb43-bca2-4115-a1a6-048526788102-webhook-cert\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.358498 master-0 kubenswrapper[16352]: I0307 21:35:39.349934 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rc925\" (UniqueName: \"kubernetes.io/projected/abc5cb43-bca2-4115-a1a6-048526788102-kube-api-access-rc925\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.460645 master-0 kubenswrapper[16352]: I0307 21:35:39.452911 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abc5cb43-bca2-4115-a1a6-048526788102-apiservice-cert\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.460645 master-0 kubenswrapper[16352]: I0307 21:35:39.453011 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abc5cb43-bca2-4115-a1a6-048526788102-webhook-cert\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.460645 master-0 kubenswrapper[16352]: I0307 21:35:39.453047 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rc925\" (UniqueName: \"kubernetes.io/projected/abc5cb43-bca2-4115-a1a6-048526788102-kube-api-access-rc925\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.463963 master-0 kubenswrapper[16352]: I0307 21:35:39.463672 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/abc5cb43-bca2-4115-a1a6-048526788102-apiservice-cert\") pod 
\"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.464381 master-0 kubenswrapper[16352]: I0307 21:35:39.464293 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/abc5cb43-bca2-4115-a1a6-048526788102-webhook-cert\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.489871 master-0 kubenswrapper[16352]: I0307 21:35:39.488943 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rc925\" (UniqueName: \"kubernetes.io/projected/abc5cb43-bca2-4115-a1a6-048526788102-kube-api-access-rc925\") pod \"metallb-operator-webhook-server-57d6f574cc-8zmmh\" (UID: \"abc5cb43-bca2-4115-a1a6-048526788102\") " pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:39.566395 master-0 kubenswrapper[16352]: I0307 21:35:39.566317 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:41.079736 master-0 kubenswrapper[16352]: I0307 21:35:41.079557 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh"] Mar 07 21:35:41.147009 master-0 kubenswrapper[16352]: I0307 21:35:41.146942 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb"] Mar 07 21:35:41.788874 master-0 kubenswrapper[16352]: I0307 21:35:41.785613 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" event={"ID":"abc5cb43-bca2-4115-a1a6-048526788102","Type":"ContainerStarted","Data":"22d16a7ae12bab19a3c44649887963c571897805a8236e2c837b373498b5f97e"} Mar 07 21:35:41.788874 master-0 kubenswrapper[16352]: I0307 21:35:41.787935 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb" event={"ID":"dd3c71c4-822b-4178-ac4e-e89659a2298b","Type":"ContainerStarted","Data":"77577d2018dd39873599b7cff309f88ce488e9bff5b66d1e18f31b4b335d908e"} Mar 07 21:35:41.810251 master-0 kubenswrapper[16352]: I0307 21:35:41.796933 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf" event={"ID":"126fe395-9c81-464a-b5f7-a68c43e4fa53","Type":"ContainerStarted","Data":"865a60ceabf602e2baab15cf7255c9673963238a460fcd3ac2e31609bbe81ac4"} Mar 07 21:35:41.810251 master-0 kubenswrapper[16352]: I0307 21:35:41.798125 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf" Mar 07 21:35:41.822114 master-0 kubenswrapper[16352]: I0307 21:35:41.821973 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2" 
event={"ID":"90cf4c11-ae4e-43d3-ab43-2abf004e8ff6","Type":"ContainerStarted","Data":"29fe9c48185f422c6d217d38d689a756f20d18a47ccb37012666545ea261b167"} Mar 07 21:35:41.855712 master-0 kubenswrapper[16352]: I0307 21:35:41.854504 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf" podStartSLOduration=2.526389764 podStartE2EDuration="11.854459307s" podCreationTimestamp="2026-03-07 21:35:30 +0000 UTC" firstStartedPulling="2026-03-07 21:35:31.290009923 +0000 UTC m=+1054.360714972" lastFinishedPulling="2026-03-07 21:35:40.618079456 +0000 UTC m=+1063.688784515" observedRunningTime="2026-03-07 21:35:41.843122017 +0000 UTC m=+1064.913827096" watchObservedRunningTime="2026-03-07 21:35:41.854459307 +0000 UTC m=+1064.925164386" Mar 07 21:35:41.890023 master-0 kubenswrapper[16352]: I0307 21:35:41.889922 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-5545bd876-p74j2" podStartSLOduration=2.713801949 podStartE2EDuration="10.88989557s" podCreationTimestamp="2026-03-07 21:35:31 +0000 UTC" firstStartedPulling="2026-03-07 21:35:32.479547969 +0000 UTC m=+1055.550253028" lastFinishedPulling="2026-03-07 21:35:40.65564159 +0000 UTC m=+1063.726346649" observedRunningTime="2026-03-07 21:35:41.880206399 +0000 UTC m=+1064.950911458" watchObservedRunningTime="2026-03-07 21:35:41.88989557 +0000 UTC m=+1064.960600629" Mar 07 21:35:42.025752 master-0 kubenswrapper[16352]: I0307 21:35:42.022601 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz"] Mar 07 21:35:42.025752 master-0 kubenswrapper[16352]: I0307 21:35:42.023775 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" Mar 07 21:35:42.032700 master-0 kubenswrapper[16352]: I0307 21:35:42.032584 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"openshift-service-ca.crt" Mar 07 21:35:42.033782 master-0 kubenswrapper[16352]: I0307 21:35:42.033482 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operators"/"kube-root-ca.crt" Mar 07 21:35:42.058284 master-0 kubenswrapper[16352]: I0307 21:35:42.058099 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz"] Mar 07 21:35:42.152657 master-0 kubenswrapper[16352]: I0307 21:35:42.152587 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sg77t\" (UniqueName: \"kubernetes.io/projected/5b844edb-e416-4cc4-9fc2-547b7c09f258-kube-api-access-sg77t\") pod \"obo-prometheus-operator-68bc856cb9-4flmz\" (UID: \"5b844edb-e416-4cc4-9fc2-547b7c09f258\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" Mar 07 21:35:42.265715 master-0 kubenswrapper[16352]: I0307 21:35:42.258298 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sg77t\" (UniqueName: \"kubernetes.io/projected/5b844edb-e416-4cc4-9fc2-547b7c09f258-kube-api-access-sg77t\") pod \"obo-prometheus-operator-68bc856cb9-4flmz\" (UID: \"5b844edb-e416-4cc4-9fc2-547b7c09f258\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" Mar 07 21:35:42.280565 master-0 kubenswrapper[16352]: I0307 21:35:42.273798 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq"] Mar 07 21:35:42.280565 master-0 kubenswrapper[16352]: I0307 21:35:42.276018 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.291641 master-0 kubenswrapper[16352]: I0307 21:35:42.285450 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"obo-prometheus-operator-admission-webhook-service-cert" Mar 07 21:35:42.300761 master-0 kubenswrapper[16352]: I0307 21:35:42.300358 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sg77t\" (UniqueName: \"kubernetes.io/projected/5b844edb-e416-4cc4-9fc2-547b7c09f258-kube-api-access-sg77t\") pod \"obo-prometheus-operator-68bc856cb9-4flmz\" (UID: \"5b844edb-e416-4cc4-9fc2-547b7c09f258\") " pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" Mar 07 21:35:42.311818 master-0 kubenswrapper[16352]: I0307 21:35:42.303356 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9"] Mar 07 21:35:42.311818 master-0 kubenswrapper[16352]: I0307 21:35:42.305801 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.337718 master-0 kubenswrapper[16352]: I0307 21:35:42.330231 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq"] Mar 07 21:35:42.365310 master-0 kubenswrapper[16352]: I0307 21:35:42.361732 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/712d0c3b-37b3-44f2-b515-808db3170312-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq\" (UID: \"712d0c3b-37b3-44f2-b515-808db3170312\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.365310 master-0 kubenswrapper[16352]: I0307 21:35:42.361816 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca09e002-34dc-4c68-804b-985b2b5545cd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9\" (UID: \"ca09e002-34dc-4c68-804b-985b2b5545cd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.365310 master-0 kubenswrapper[16352]: I0307 21:35:42.361979 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca09e002-34dc-4c68-804b-985b2b5545cd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9\" (UID: \"ca09e002-34dc-4c68-804b-985b2b5545cd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.365310 master-0 kubenswrapper[16352]: I0307 21:35:42.362049 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" 
(UniqueName: \"kubernetes.io/secret/712d0c3b-37b3-44f2-b515-808db3170312-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq\" (UID: \"712d0c3b-37b3-44f2-b515-808db3170312\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.366798 master-0 kubenswrapper[16352]: I0307 21:35:42.366593 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9"] Mar 07 21:35:42.405473 master-0 kubenswrapper[16352]: I0307 21:35:42.405371 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-sn8gn"] Mar 07 21:35:42.410050 master-0 kubenswrapper[16352]: I0307 21:35:42.406862 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.421566 master-0 kubenswrapper[16352]: I0307 21:35:42.413211 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operators"/"observability-operator-tls" Mar 07 21:35:42.430567 master-0 kubenswrapper[16352]: I0307 21:35:42.424310 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/observability-operator-59bdc8b94-sn8gn"] Mar 07 21:35:42.464715 master-0 kubenswrapper[16352]: I0307 21:35:42.463802 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/712d0c3b-37b3-44f2-b515-808db3170312-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq\" (UID: \"712d0c3b-37b3-44f2-b515-808db3170312\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.464715 master-0 kubenswrapper[16352]: I0307 21:35:42.463895 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: 
\"kubernetes.io/secret/712d0c3b-37b3-44f2-b515-808db3170312-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq\" (UID: \"712d0c3b-37b3-44f2-b515-808db3170312\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.464715 master-0 kubenswrapper[16352]: I0307 21:35:42.463923 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca09e002-34dc-4c68-804b-985b2b5545cd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9\" (UID: \"ca09e002-34dc-4c68-804b-985b2b5545cd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.464715 master-0 kubenswrapper[16352]: I0307 21:35:42.463981 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/653d9aa0-47bc-48bf-b250-6c0728b489c4-observability-operator-tls\") pod \"observability-operator-59bdc8b94-sn8gn\" (UID: \"653d9aa0-47bc-48bf-b250-6c0728b489c4\") " pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.464715 master-0 kubenswrapper[16352]: I0307 21:35:42.464020 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca09e002-34dc-4c68-804b-985b2b5545cd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9\" (UID: \"ca09e002-34dc-4c68-804b-985b2b5545cd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.464715 master-0 kubenswrapper[16352]: I0307 21:35:42.464045 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckxzd\" (UniqueName: \"kubernetes.io/projected/653d9aa0-47bc-48bf-b250-6c0728b489c4-kube-api-access-ckxzd\") pod 
\"observability-operator-59bdc8b94-sn8gn\" (UID: \"653d9aa0-47bc-48bf-b250-6c0728b489c4\") " pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.467259 master-0 kubenswrapper[16352]: I0307 21:35:42.467221 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/712d0c3b-37b3-44f2-b515-808db3170312-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq\" (UID: \"712d0c3b-37b3-44f2-b515-808db3170312\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.468222 master-0 kubenswrapper[16352]: I0307 21:35:42.468172 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/ca09e002-34dc-4c68-804b-985b2b5545cd-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9\" (UID: \"ca09e002-34dc-4c68-804b-985b2b5545cd\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.468306 master-0 kubenswrapper[16352]: I0307 21:35:42.468225 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/712d0c3b-37b3-44f2-b515-808db3170312-apiservice-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq\" (UID: \"712d0c3b-37b3-44f2-b515-808db3170312\") " pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.471700 master-0 kubenswrapper[16352]: I0307 21:35:42.471411 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ca09e002-34dc-4c68-804b-985b2b5545cd-webhook-cert\") pod \"obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9\" (UID: \"ca09e002-34dc-4c68-804b-985b2b5545cd\") " 
pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.563826 master-0 kubenswrapper[16352]: I0307 21:35:42.563152 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" Mar 07 21:35:42.572237 master-0 kubenswrapper[16352]: I0307 21:35:42.566110 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/653d9aa0-47bc-48bf-b250-6c0728b489c4-observability-operator-tls\") pod \"observability-operator-59bdc8b94-sn8gn\" (UID: \"653d9aa0-47bc-48bf-b250-6c0728b489c4\") " pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.572237 master-0 kubenswrapper[16352]: I0307 21:35:42.566234 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ckxzd\" (UniqueName: \"kubernetes.io/projected/653d9aa0-47bc-48bf-b250-6c0728b489c4-kube-api-access-ckxzd\") pod \"observability-operator-59bdc8b94-sn8gn\" (UID: \"653d9aa0-47bc-48bf-b250-6c0728b489c4\") " pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.572237 master-0 kubenswrapper[16352]: I0307 21:35:42.571138 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"observability-operator-tls\" (UniqueName: \"kubernetes.io/secret/653d9aa0-47bc-48bf-b250-6c0728b489c4-observability-operator-tls\") pod \"observability-operator-59bdc8b94-sn8gn\" (UID: \"653d9aa0-47bc-48bf-b250-6c0728b489c4\") " pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.598180 master-0 kubenswrapper[16352]: I0307 21:35:42.596213 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-rcw9j"] Mar 07 21:35:42.598180 master-0 kubenswrapper[16352]: I0307 21:35:42.597337 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:42.602975 master-0 kubenswrapper[16352]: I0307 21:35:42.600881 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-rcw9j"] Mar 07 21:35:42.611374 master-0 kubenswrapper[16352]: I0307 21:35:42.610219 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckxzd\" (UniqueName: \"kubernetes.io/projected/653d9aa0-47bc-48bf-b250-6c0728b489c4-kube-api-access-ckxzd\") pod \"observability-operator-59bdc8b94-sn8gn\" (UID: \"653d9aa0-47bc-48bf-b250-6c0728b489c4\") " pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.659340 master-0 kubenswrapper[16352]: I0307 21:35:42.656838 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" Mar 07 21:35:42.667927 master-0 kubenswrapper[16352]: I0307 21:35:42.667872 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc589985-1e1e-4d0d-b8ed-172eae644f33-openshift-service-ca\") pod \"perses-operator-5bf474d74f-rcw9j\" (UID: \"fc589985-1e1e-4d0d-b8ed-172eae644f33\") " pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:42.668003 master-0 kubenswrapper[16352]: I0307 21:35:42.667962 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6r8r4\" (UniqueName: \"kubernetes.io/projected/fc589985-1e1e-4d0d-b8ed-172eae644f33-kube-api-access-6r8r4\") pod \"perses-operator-5bf474d74f-rcw9j\" (UID: \"fc589985-1e1e-4d0d-b8ed-172eae644f33\") " pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:42.677476 master-0 kubenswrapper[16352]: I0307 21:35:42.674435 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" Mar 07 21:35:42.733172 master-0 kubenswrapper[16352]: I0307 21:35:42.730572 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:42.779717 master-0 kubenswrapper[16352]: I0307 21:35:42.770336 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc589985-1e1e-4d0d-b8ed-172eae644f33-openshift-service-ca\") pod \"perses-operator-5bf474d74f-rcw9j\" (UID: \"fc589985-1e1e-4d0d-b8ed-172eae644f33\") " pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:42.779717 master-0 kubenswrapper[16352]: I0307 21:35:42.770463 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6r8r4\" (UniqueName: \"kubernetes.io/projected/fc589985-1e1e-4d0d-b8ed-172eae644f33-kube-api-access-6r8r4\") pod \"perses-operator-5bf474d74f-rcw9j\" (UID: \"fc589985-1e1e-4d0d-b8ed-172eae644f33\") " pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:42.779717 master-0 kubenswrapper[16352]: I0307 21:35:42.771459 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openshift-service-ca\" (UniqueName: \"kubernetes.io/configmap/fc589985-1e1e-4d0d-b8ed-172eae644f33-openshift-service-ca\") pod \"perses-operator-5bf474d74f-rcw9j\" (UID: \"fc589985-1e1e-4d0d-b8ed-172eae644f33\") " pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:42.798981 master-0 kubenswrapper[16352]: I0307 21:35:42.790079 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6r8r4\" (UniqueName: \"kubernetes.io/projected/fc589985-1e1e-4d0d-b8ed-172eae644f33-kube-api-access-6r8r4\") pod \"perses-operator-5bf474d74f-rcw9j\" (UID: \"fc589985-1e1e-4d0d-b8ed-172eae644f33\") " 
pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:43.030539 master-0 kubenswrapper[16352]: I0307 21:35:43.030441 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:43.049693 master-0 kubenswrapper[16352]: I0307 21:35:43.049406 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz"] Mar 07 21:35:43.067797 master-0 kubenswrapper[16352]: W0307 21:35:43.063422 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b844edb_e416_4cc4_9fc2_547b7c09f258.slice/crio-70bb78f85f6adf62765fcfe0ef0a60aa09ee313b6ca097df01f4d15cc2b44d58 WatchSource:0}: Error finding container 70bb78f85f6adf62765fcfe0ef0a60aa09ee313b6ca097df01f4d15cc2b44d58: Status 404 returned error can't find the container with id 70bb78f85f6adf62765fcfe0ef0a60aa09ee313b6ca097df01f4d15cc2b44d58 Mar 07 21:35:43.177773 master-0 kubenswrapper[16352]: I0307 21:35:43.175733 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq"] Mar 07 21:35:43.320965 master-0 kubenswrapper[16352]: I0307 21:35:43.316115 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-548z6"] Mar 07 21:35:43.320965 master-0 kubenswrapper[16352]: I0307 21:35:43.318208 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" Mar 07 21:35:43.334306 master-0 kubenswrapper[16352]: W0307 21:35:43.334229 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podca09e002_34dc_4c68_804b_985b2b5545cd.slice/crio-343f6f19621235a818910dddc80588efb5634f278f6632f061994a4bbc8286a1 WatchSource:0}: Error finding container 343f6f19621235a818910dddc80588efb5634f278f6632f061994a4bbc8286a1: Status 404 returned error can't find the container with id 343f6f19621235a818910dddc80588efb5634f278f6632f061994a4bbc8286a1 Mar 07 21:35:43.334859 master-0 kubenswrapper[16352]: I0307 21:35:43.334823 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 07 21:35:43.334985 master-0 kubenswrapper[16352]: I0307 21:35:43.334919 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 07 21:35:43.354218 master-0 kubenswrapper[16352]: I0307 21:35:43.353708 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9"] Mar 07 21:35:43.380307 master-0 kubenswrapper[16352]: I0307 21:35:43.379676 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-548z6"] Mar 07 21:35:43.397507 master-0 kubenswrapper[16352]: I0307 21:35:43.392715 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2hf8h\" (UniqueName: \"kubernetes.io/projected/e1489fdb-f97d-49d9-86b5-384fae371409-kube-api-access-2hf8h\") pod \"nmstate-operator-75c5dccd6c-548z6\" (UID: \"e1489fdb-f97d-49d9-86b5-384fae371409\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" Mar 07 21:35:43.416363 master-0 kubenswrapper[16352]: I0307 21:35:43.416291 16352 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openshift-operators/observability-operator-59bdc8b94-sn8gn"] Mar 07 21:35:43.495473 master-0 kubenswrapper[16352]: I0307 21:35:43.495391 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2hf8h\" (UniqueName: \"kubernetes.io/projected/e1489fdb-f97d-49d9-86b5-384fae371409-kube-api-access-2hf8h\") pod \"nmstate-operator-75c5dccd6c-548z6\" (UID: \"e1489fdb-f97d-49d9-86b5-384fae371409\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" Mar 07 21:35:43.528182 master-0 kubenswrapper[16352]: I0307 21:35:43.528124 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2hf8h\" (UniqueName: \"kubernetes.io/projected/e1489fdb-f97d-49d9-86b5-384fae371409-kube-api-access-2hf8h\") pod \"nmstate-operator-75c5dccd6c-548z6\" (UID: \"e1489fdb-f97d-49d9-86b5-384fae371409\") " pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" Mar 07 21:35:43.582893 master-0 kubenswrapper[16352]: I0307 21:35:43.579508 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operators/perses-operator-5bf474d74f-rcw9j"] Mar 07 21:35:43.655736 master-0 kubenswrapper[16352]: I0307 21:35:43.655655 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" Mar 07 21:35:43.863947 master-0 kubenswrapper[16352]: I0307 21:35:43.847971 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" event={"ID":"fc589985-1e1e-4d0d-b8ed-172eae644f33","Type":"ContainerStarted","Data":"51db4d1dd1adf04699d8494a837c8bbcc5839bc07243b0e527c5dc524da6806e"} Mar 07 21:35:43.863947 master-0 kubenswrapper[16352]: I0307 21:35:43.849718 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" event={"ID":"653d9aa0-47bc-48bf-b250-6c0728b489c4","Type":"ContainerStarted","Data":"9dc57e4a633a8e41af9b9ad6a7e57a1d08bb30f6955e66c35edbc81b3ce91028"} Mar 07 21:35:43.863947 master-0 kubenswrapper[16352]: I0307 21:35:43.851233 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" event={"ID":"ca09e002-34dc-4c68-804b-985b2b5545cd","Type":"ContainerStarted","Data":"343f6f19621235a818910dddc80588efb5634f278f6632f061994a4bbc8286a1"} Mar 07 21:35:43.863947 master-0 kubenswrapper[16352]: I0307 21:35:43.853291 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" event={"ID":"5b844edb-e416-4cc4-9fc2-547b7c09f258","Type":"ContainerStarted","Data":"70bb78f85f6adf62765fcfe0ef0a60aa09ee313b6ca097df01f4d15cc2b44d58"} Mar 07 21:35:43.863947 master-0 kubenswrapper[16352]: I0307 21:35:43.855757 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" event={"ID":"712d0c3b-37b3-44f2-b515-808db3170312","Type":"ContainerStarted","Data":"1f248dc26252de98dca678296af560786e30b4f0535326ee6ff87285d406b9aa"} Mar 07 21:35:44.211794 master-0 kubenswrapper[16352]: I0307 21:35:44.207889 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-nmstate/nmstate-operator-75c5dccd6c-548z6"] Mar 07 21:35:44.895060 master-0 kubenswrapper[16352]: I0307 21:35:44.894963 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" event={"ID":"e1489fdb-f97d-49d9-86b5-384fae371409","Type":"ContainerStarted","Data":"058ed6432b5210c76df780b3625a821d852651c518dd0121c307a8dc33541321"} Mar 07 21:35:45.786457 master-0 kubenswrapper[16352]: I0307 21:35:45.786365 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-6888856db4-vmqtf" Mar 07 21:35:46.792088 master-0 kubenswrapper[16352]: I0307 21:35:46.785966 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-545d4d4674-fz858"] Mar 07 21:35:46.792088 master-0 kubenswrapper[16352]: I0307 21:35:46.789641 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:46.802197 master-0 kubenswrapper[16352]: I0307 21:35:46.802113 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-fz858"] Mar 07 21:35:46.894433 master-0 kubenswrapper[16352]: I0307 21:35:46.894344 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svcqx\" (UniqueName: \"kubernetes.io/projected/44c397ea-73f4-47dd-b99c-be149986ccfe-kube-api-access-svcqx\") pod \"cert-manager-545d4d4674-fz858\" (UID: \"44c397ea-73f4-47dd-b99c-be149986ccfe\") " pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:46.894821 master-0 kubenswrapper[16352]: I0307 21:35:46.894472 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44c397ea-73f4-47dd-b99c-be149986ccfe-bound-sa-token\") pod \"cert-manager-545d4d4674-fz858\" (UID: \"44c397ea-73f4-47dd-b99c-be149986ccfe\") " 
pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:47.003525 master-0 kubenswrapper[16352]: I0307 21:35:47.003438 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svcqx\" (UniqueName: \"kubernetes.io/projected/44c397ea-73f4-47dd-b99c-be149986ccfe-kube-api-access-svcqx\") pod \"cert-manager-545d4d4674-fz858\" (UID: \"44c397ea-73f4-47dd-b99c-be149986ccfe\") " pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:47.003929 master-0 kubenswrapper[16352]: I0307 21:35:47.003713 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44c397ea-73f4-47dd-b99c-be149986ccfe-bound-sa-token\") pod \"cert-manager-545d4d4674-fz858\" (UID: \"44c397ea-73f4-47dd-b99c-be149986ccfe\") " pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:47.027531 master-0 kubenswrapper[16352]: I0307 21:35:47.027044 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/44c397ea-73f4-47dd-b99c-be149986ccfe-bound-sa-token\") pod \"cert-manager-545d4d4674-fz858\" (UID: \"44c397ea-73f4-47dd-b99c-be149986ccfe\") " pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:47.028754 master-0 kubenswrapper[16352]: I0307 21:35:47.028645 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svcqx\" (UniqueName: \"kubernetes.io/projected/44c397ea-73f4-47dd-b99c-be149986ccfe-kube-api-access-svcqx\") pod \"cert-manager-545d4d4674-fz858\" (UID: \"44c397ea-73f4-47dd-b99c-be149986ccfe\") " pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:47.115024 master-0 kubenswrapper[16352]: I0307 21:35:47.114934 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-545d4d4674-fz858" Mar 07 21:35:48.667814 master-0 kubenswrapper[16352]: I0307 21:35:48.667712 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6f9c4688bb-5k492" podUID="765298df-6296-4283-a8dc-20135b6765ea" containerName="console" containerID="cri-o://12d200dab0789d8cea4ca8b081e47f4b9900b9437626abbfe197ed73e6e18b6d" gracePeriod=15 Mar 07 21:35:49.567707 master-0 kubenswrapper[16352]: I0307 21:35:49.566268 16352 patch_prober.go:28] interesting pod/console-6f9c4688bb-5k492 container/console namespace/openshift-console: Readiness probe status=failure output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" start-of-body= Mar 07 21:35:49.567707 master-0 kubenswrapper[16352]: I0307 21:35:49.566341 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/console-6f9c4688bb-5k492" podUID="765298df-6296-4283-a8dc-20135b6765ea" containerName="console" probeResult="failure" output="Get \"https://10.128.0.106:8443/health\": dial tcp 10.128.0.106:8443: connect: connection refused" Mar 07 21:35:55.014334 master-0 kubenswrapper[16352]: I0307 21:35:55.014279 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f9c4688bb-5k492_765298df-6296-4283-a8dc-20135b6765ea/console/0.log" Mar 07 21:35:55.015072 master-0 kubenswrapper[16352]: I0307 21:35:55.014345 16352 generic.go:334] "Generic (PLEG): container finished" podID="765298df-6296-4283-a8dc-20135b6765ea" containerID="12d200dab0789d8cea4ca8b081e47f4b9900b9437626abbfe197ed73e6e18b6d" exitCode=2 Mar 07 21:35:55.015072 master-0 kubenswrapper[16352]: I0307 21:35:55.014389 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9c4688bb-5k492" 
event={"ID":"765298df-6296-4283-a8dc-20135b6765ea","Type":"ContainerDied","Data":"12d200dab0789d8cea4ca8b081e47f4b9900b9437626abbfe197ed73e6e18b6d"} Mar 07 21:35:56.632743 master-0 kubenswrapper[16352]: I0307 21:35:56.632582 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f9c4688bb-5k492_765298df-6296-4283-a8dc-20135b6765ea/console/0.log" Mar 07 21:35:56.633559 master-0 kubenswrapper[16352]: I0307 21:35:56.632889 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:35:56.721667 master-0 kubenswrapper[16352]: I0307 21:35:56.721611 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-service-ca\") pod \"765298df-6296-4283-a8dc-20135b6765ea\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " Mar 07 21:35:56.721820 master-0 kubenswrapper[16352]: I0307 21:35:56.721786 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-oauth-serving-cert\") pod \"765298df-6296-4283-a8dc-20135b6765ea\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " Mar 07 21:35:56.721884 master-0 kubenswrapper[16352]: I0307 21:35:56.721820 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-serving-cert\") pod \"765298df-6296-4283-a8dc-20135b6765ea\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " Mar 07 21:35:56.721995 master-0 kubenswrapper[16352]: I0307 21:35:56.721962 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-console-config\") pod 
\"765298df-6296-4283-a8dc-20135b6765ea\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " Mar 07 21:35:56.722052 master-0 kubenswrapper[16352]: I0307 21:35:56.721997 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-62cnz\" (UniqueName: \"kubernetes.io/projected/765298df-6296-4283-a8dc-20135b6765ea-kube-api-access-62cnz\") pod \"765298df-6296-4283-a8dc-20135b6765ea\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " Mar 07 21:35:56.722103 master-0 kubenswrapper[16352]: I0307 21:35:56.722073 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-trusted-ca-bundle\") pod \"765298df-6296-4283-a8dc-20135b6765ea\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " Mar 07 21:35:56.722160 master-0 kubenswrapper[16352]: I0307 21:35:56.722101 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-oauth-config\") pod \"765298df-6296-4283-a8dc-20135b6765ea\" (UID: \"765298df-6296-4283-a8dc-20135b6765ea\") " Mar 07 21:35:56.722160 master-0 kubenswrapper[16352]: I0307 21:35:56.722108 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-service-ca" (OuterVolumeSpecName: "service-ca") pod "765298df-6296-4283-a8dc-20135b6765ea" (UID: "765298df-6296-4283-a8dc-20135b6765ea"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:35:56.722510 master-0 kubenswrapper[16352]: I0307 21:35:56.722484 16352 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:56.722819 master-0 kubenswrapper[16352]: I0307 21:35:56.722759 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-console-config" (OuterVolumeSpecName: "console-config") pod "765298df-6296-4283-a8dc-20135b6765ea" (UID: "765298df-6296-4283-a8dc-20135b6765ea"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:35:56.722927 master-0 kubenswrapper[16352]: I0307 21:35:56.722843 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "765298df-6296-4283-a8dc-20135b6765ea" (UID: "765298df-6296-4283-a8dc-20135b6765ea"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:35:56.722927 master-0 kubenswrapper[16352]: I0307 21:35:56.722876 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "765298df-6296-4283-a8dc-20135b6765ea" (UID: "765298df-6296-4283-a8dc-20135b6765ea"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:35:56.725302 master-0 kubenswrapper[16352]: I0307 21:35:56.725216 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "765298df-6296-4283-a8dc-20135b6765ea" (UID: "765298df-6296-4283-a8dc-20135b6765ea"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:35:56.726194 master-0 kubenswrapper[16352]: I0307 21:35:56.726156 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/765298df-6296-4283-a8dc-20135b6765ea-kube-api-access-62cnz" (OuterVolumeSpecName: "kube-api-access-62cnz") pod "765298df-6296-4283-a8dc-20135b6765ea" (UID: "765298df-6296-4283-a8dc-20135b6765ea"). InnerVolumeSpecName "kube-api-access-62cnz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:35:56.727020 master-0 kubenswrapper[16352]: I0307 21:35:56.726975 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "765298df-6296-4283-a8dc-20135b6765ea" (UID: "765298df-6296-4283-a8dc-20135b6765ea"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:35:56.825012 master-0 kubenswrapper[16352]: I0307 21:35:56.824946 16352 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-console-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:56.825012 master-0 kubenswrapper[16352]: I0307 21:35:56.824994 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-62cnz\" (UniqueName: \"kubernetes.io/projected/765298df-6296-4283-a8dc-20135b6765ea-kube-api-access-62cnz\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:56.825012 master-0 kubenswrapper[16352]: I0307 21:35:56.825005 16352 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:56.825012 master-0 kubenswrapper[16352]: I0307 21:35:56.825015 16352 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:56.825416 master-0 kubenswrapper[16352]: I0307 21:35:56.825025 16352 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/765298df-6296-4283-a8dc-20135b6765ea-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:56.825416 master-0 kubenswrapper[16352]: I0307 21:35:56.825035 16352 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/765298df-6296-4283-a8dc-20135b6765ea-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:35:57.040938 master-0 kubenswrapper[16352]: I0307 21:35:57.040073 16352 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console_console-6f9c4688bb-5k492_765298df-6296-4283-a8dc-20135b6765ea/console/0.log" Mar 07 21:35:57.040938 master-0 kubenswrapper[16352]: I0307 21:35:57.040219 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f9c4688bb-5k492" Mar 07 21:35:57.040938 master-0 kubenswrapper[16352]: I0307 21:35:57.040750 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f9c4688bb-5k492" event={"ID":"765298df-6296-4283-a8dc-20135b6765ea","Type":"ContainerDied","Data":"a7832ac75f5c5a132598b682a567a7c8129e13c5265490814e93d667d576ff53"} Mar 07 21:35:57.040938 master-0 kubenswrapper[16352]: I0307 21:35:57.040851 16352 scope.go:117] "RemoveContainer" containerID="12d200dab0789d8cea4ca8b081e47f4b9900b9437626abbfe197ed73e6e18b6d" Mar 07 21:35:57.047702 master-0 kubenswrapper[16352]: I0307 21:35:57.047634 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" event={"ID":"653d9aa0-47bc-48bf-b250-6c0728b489c4","Type":"ContainerStarted","Data":"661f28d94d766c1900a0ecfded1a3c9447dc67d97dadfa7194c1f26ca3a395c2"} Mar 07 21:35:57.049077 master-0 kubenswrapper[16352]: I0307 21:35:57.049052 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:57.054940 master-0 kubenswrapper[16352]: I0307 21:35:57.054859 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb" event={"ID":"dd3c71c4-822b-4178-ac4e-e89659a2298b","Type":"ContainerStarted","Data":"0be143957cb3615bd7a664d8a5e94610194d8ed5b70da4fab1f8d37149b960e6"} Mar 07 21:35:57.055106 master-0 kubenswrapper[16352]: I0307 21:35:57.055087 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb" Mar 07 
21:35:57.058214 master-0 kubenswrapper[16352]: I0307 21:35:57.058176 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" Mar 07 21:35:57.081228 master-0 kubenswrapper[16352]: I0307 21:35:57.081095 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/observability-operator-59bdc8b94-sn8gn" podStartSLOduration=1.8572030050000001 podStartE2EDuration="15.081057092s" podCreationTimestamp="2026-03-07 21:35:42 +0000 UTC" firstStartedPulling="2026-03-07 21:35:43.416921708 +0000 UTC m=+1066.487626777" lastFinishedPulling="2026-03-07 21:35:56.640775805 +0000 UTC m=+1079.711480864" observedRunningTime="2026-03-07 21:35:57.077587429 +0000 UTC m=+1080.148292488" watchObservedRunningTime="2026-03-07 21:35:57.081057092 +0000 UTC m=+1080.151762151" Mar 07 21:35:57.118503 master-0 kubenswrapper[16352]: I0307 21:35:57.118424 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f9c4688bb-5k492"] Mar 07 21:35:57.133153 master-0 kubenswrapper[16352]: I0307 21:35:57.131612 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f9c4688bb-5k492"] Mar 07 21:35:57.151785 master-0 kubenswrapper[16352]: I0307 21:35:57.151616 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-545d4d4674-fz858"] Mar 07 21:35:57.154694 master-0 kubenswrapper[16352]: I0307 21:35:57.154615 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb" podStartSLOduration=4.199733512 podStartE2EDuration="19.154600782s" podCreationTimestamp="2026-03-07 21:35:38 +0000 UTC" firstStartedPulling="2026-03-07 21:35:41.151423417 +0000 UTC m=+1064.222128476" lastFinishedPulling="2026-03-07 21:35:56.106290687 +0000 UTC m=+1079.176995746" observedRunningTime="2026-03-07 21:35:57.141369857 +0000 UTC 
m=+1080.212074916" watchObservedRunningTime="2026-03-07 21:35:57.154600782 +0000 UTC m=+1080.225305841" Mar 07 21:35:57.207424 master-0 kubenswrapper[16352]: I0307 21:35:57.204699 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="765298df-6296-4283-a8dc-20135b6765ea" path="/var/lib/kubelet/pods/765298df-6296-4283-a8dc-20135b6765ea/volumes" Mar 07 21:35:58.070247 master-0 kubenswrapper[16352]: I0307 21:35:58.070141 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" event={"ID":"5b844edb-e416-4cc4-9fc2-547b7c09f258","Type":"ContainerStarted","Data":"1f0fd4e0476be5b00d440c9c1bec17825e838291925296d140523b567b4683df"} Mar 07 21:35:58.073674 master-0 kubenswrapper[16352]: I0307 21:35:58.073565 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" event={"ID":"712d0c3b-37b3-44f2-b515-808db3170312","Type":"ContainerStarted","Data":"2f2c6f7b7a16a11e1b13edc38f5de51cbc1049054ce3fb6ad387e809193ed4db"} Mar 07 21:35:58.076529 master-0 kubenswrapper[16352]: I0307 21:35:58.076451 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" event={"ID":"fc589985-1e1e-4d0d-b8ed-172eae644f33","Type":"ContainerStarted","Data":"eed1be180e057a3817991b39c79cc4b33fb027174438b04f3df4d3d6dddfe0d3"} Mar 07 21:35:58.077017 master-0 kubenswrapper[16352]: I0307 21:35:58.076965 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:35:58.079531 master-0 kubenswrapper[16352]: I0307 21:35:58.079434 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" event={"ID":"e1489fdb-f97d-49d9-86b5-384fae371409","Type":"ContainerStarted","Data":"f35f8687a2d2f523273f3c6a5b760a76c27e427f1878c195780a9582aaad2a52"} Mar 07 
21:35:58.082107 master-0 kubenswrapper[16352]: I0307 21:35:58.082036 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" event={"ID":"abc5cb43-bca2-4115-a1a6-048526788102","Type":"ContainerStarted","Data":"843c2a6e82849d05fe9532cf25bf0c155075a0c477c8da69f0b72f312636123d"} Mar 07 21:35:58.082184 master-0 kubenswrapper[16352]: I0307 21:35:58.082124 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:35:58.084646 master-0 kubenswrapper[16352]: I0307 21:35:58.084597 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-fz858" event={"ID":"44c397ea-73f4-47dd-b99c-be149986ccfe","Type":"ContainerStarted","Data":"c521e6d998ff6d5394788b77d6824bd758958a580bc7626d721851fed37973f6"} Mar 07 21:35:58.084763 master-0 kubenswrapper[16352]: I0307 21:35:58.084648 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-545d4d4674-fz858" event={"ID":"44c397ea-73f4-47dd-b99c-be149986ccfe","Type":"ContainerStarted","Data":"44ae7ebadf54749203ac26baf599f2a4563b02575fe13a94bf83f1e8bc5af9f5"} Mar 07 21:35:58.089914 master-0 kubenswrapper[16352]: I0307 21:35:58.089854 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" event={"ID":"ca09e002-34dc-4c68-804b-985b2b5545cd","Type":"ContainerStarted","Data":"c7b1707a296335450d1571d3423b6089e563a0dff02a3cf872eeabf20b84a339"} Mar 07 21:35:58.105564 master-0 kubenswrapper[16352]: I0307 21:35:58.105426 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-68bc856cb9-4flmz" podStartSLOduration=3.489364057 podStartE2EDuration="17.105393167s" podCreationTimestamp="2026-03-07 21:35:41 +0000 UTC" firstStartedPulling="2026-03-07 21:35:43.073018864 +0000 UTC 
m=+1066.143723923" lastFinishedPulling="2026-03-07 21:35:56.689047974 +0000 UTC m=+1079.759753033" observedRunningTime="2026-03-07 21:35:58.096775102 +0000 UTC m=+1081.167480241" watchObservedRunningTime="2026-03-07 21:35:58.105393167 +0000 UTC m=+1081.176098266" Mar 07 21:35:58.167977 master-0 kubenswrapper[16352]: I0307 21:35:58.166605 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-545d4d4674-fz858" podStartSLOduration=12.166577613 podStartE2EDuration="12.166577613s" podCreationTimestamp="2026-03-07 21:35:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:35:58.145075712 +0000 UTC m=+1081.215780831" watchObservedRunningTime="2026-03-07 21:35:58.166577613 +0000 UTC m=+1081.237282692" Mar 07 21:35:58.207359 master-0 kubenswrapper[16352]: I0307 21:35:58.205404 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wzsq9" podStartSLOduration=2.853468862 podStartE2EDuration="16.205375087s" podCreationTimestamp="2026-03-07 21:35:42 +0000 UTC" firstStartedPulling="2026-03-07 21:35:43.35315608 +0000 UTC m=+1066.423861139" lastFinishedPulling="2026-03-07 21:35:56.705062315 +0000 UTC m=+1079.775767364" observedRunningTime="2026-03-07 21:35:58.189010777 +0000 UTC m=+1081.259715856" watchObservedRunningTime="2026-03-07 21:35:58.205375087 +0000 UTC m=+1081.276080176" Mar 07 21:35:58.253167 master-0 kubenswrapper[16352]: I0307 21:35:58.252484 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" podStartSLOduration=3.167314561 podStartE2EDuration="16.252463048s" podCreationTimestamp="2026-03-07 21:35:42 +0000 UTC" firstStartedPulling="2026-03-07 21:35:43.586670377 +0000 UTC m=+1066.657375436" lastFinishedPulling="2026-03-07 21:35:56.671818874 
+0000 UTC m=+1079.742523923" observedRunningTime="2026-03-07 21:35:58.232522772 +0000 UTC m=+1081.303227841" watchObservedRunningTime="2026-03-07 21:35:58.252463048 +0000 UTC m=+1081.323168107" Mar 07 21:35:58.264132 master-0 kubenswrapper[16352]: I0307 21:35:58.263712 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" podStartSLOduration=3.718039296 podStartE2EDuration="19.263670534s" podCreationTimestamp="2026-03-07 21:35:39 +0000 UTC" firstStartedPulling="2026-03-07 21:35:41.095407464 +0000 UTC m=+1064.166112523" lastFinishedPulling="2026-03-07 21:35:56.641038702 +0000 UTC m=+1079.711743761" observedRunningTime="2026-03-07 21:35:58.260082179 +0000 UTC m=+1081.330787248" watchObservedRunningTime="2026-03-07 21:35:58.263670534 +0000 UTC m=+1081.334375583" Mar 07 21:35:58.292061 master-0 kubenswrapper[16352]: I0307 21:35:58.291863 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-75c5dccd6c-548z6" podStartSLOduration=2.857428332 podStartE2EDuration="15.291822394s" podCreationTimestamp="2026-03-07 21:35:43 +0000 UTC" firstStartedPulling="2026-03-07 21:35:44.25373355 +0000 UTC m=+1067.324438609" lastFinishedPulling="2026-03-07 21:35:56.688127612 +0000 UTC m=+1079.758832671" observedRunningTime="2026-03-07 21:35:58.288570886 +0000 UTC m=+1081.359275975" watchObservedRunningTime="2026-03-07 21:35:58.291822394 +0000 UTC m=+1081.362527473" Mar 07 21:35:58.333565 master-0 kubenswrapper[16352]: I0307 21:35:58.331971 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/obo-prometheus-operator-admission-webhook-7c4564c96f-wxwkq" podStartSLOduration=2.864431182 podStartE2EDuration="16.331867817s" podCreationTimestamp="2026-03-07 21:35:42 +0000 UTC" firstStartedPulling="2026-03-07 21:35:43.204709827 +0000 UTC m=+1066.275414886" lastFinishedPulling="2026-03-07 21:35:56.672146462 
+0000 UTC m=+1079.742851521" observedRunningTime="2026-03-07 21:35:58.315445035 +0000 UTC m=+1081.386150094" watchObservedRunningTime="2026-03-07 21:35:58.331867817 +0000 UTC m=+1081.402572876" Mar 07 21:36:03.037499 master-0 kubenswrapper[16352]: I0307 21:36:03.037417 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/perses-operator-5bf474d74f-rcw9j" Mar 07 21:36:09.579398 master-0 kubenswrapper[16352]: I0307 21:36:09.579331 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-57d6f574cc-8zmmh" Mar 07 21:36:29.206198 master-0 kubenswrapper[16352]: I0307 21:36:29.206103 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-547df9ff8b-bpxrb" Mar 07 21:36:36.714528 master-0 kubenswrapper[16352]: I0307 21:36:36.714159 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-9cvbt"] Mar 07 21:36:36.715547 master-0 kubenswrapper[16352]: E0307 21:36:36.714780 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="765298df-6296-4283-a8dc-20135b6765ea" containerName="console" Mar 07 21:36:36.715547 master-0 kubenswrapper[16352]: I0307 21:36:36.714799 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="765298df-6296-4283-a8dc-20135b6765ea" containerName="console" Mar 07 21:36:36.715547 master-0 kubenswrapper[16352]: I0307 21:36:36.714990 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="765298df-6296-4283-a8dc-20135b6765ea" containerName="console" Mar 07 21:36:36.731776 master-0 kubenswrapper[16352]: I0307 21:36:36.731228 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.734592 master-0 kubenswrapper[16352]: I0307 21:36:36.734560 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 07 21:36:36.737799 master-0 kubenswrapper[16352]: I0307 21:36:36.737742 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 07 21:36:36.744048 master-0 kubenswrapper[16352]: I0307 21:36:36.743966 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67"] Mar 07 21:36:36.746013 master-0 kubenswrapper[16352]: I0307 21:36:36.745978 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:36.755267 master-0 kubenswrapper[16352]: I0307 21:36:36.750966 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 07 21:36:36.763775 master-0 kubenswrapper[16352]: I0307 21:36:36.761823 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67"] Mar 07 21:36:36.763775 master-0 kubenswrapper[16352]: I0307 21:36:36.763547 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-startup\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.763775 master-0 kubenswrapper[16352]: I0307 21:36:36.763597 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgklp\" (UniqueName: \"kubernetes.io/projected/88c33979-3534-44e1-a236-519aabbb8682-kube-api-access-fgklp\") pod \"frr-k8s-webhook-server-7f989f654f-vnw67\" (UID: 
\"88c33979-3534-44e1-a236-519aabbb8682\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:36.763775 master-0 kubenswrapper[16352]: I0307 21:36:36.763661 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-reloader\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.765546 master-0 kubenswrapper[16352]: I0307 21:36:36.764098 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c33979-3534-44e1-a236-519aabbb8682-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vnw67\" (UID: \"88c33979-3534-44e1-a236-519aabbb8682\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:36.765954 master-0 kubenswrapper[16352]: I0307 21:36:36.765822 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fh7pm\" (UniqueName: \"kubernetes.io/projected/869f7c43-071a-4079-b7f4-5d97cf58085e-kube-api-access-fh7pm\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.775921 master-0 kubenswrapper[16352]: I0307 21:36:36.774948 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-conf\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.775921 master-0 kubenswrapper[16352]: I0307 21:36:36.775021 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-sockets\") pod 
\"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.775921 master-0 kubenswrapper[16352]: I0307 21:36:36.775053 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.775921 master-0 kubenswrapper[16352]: I0307 21:36:36.775115 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics-certs\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.834859 master-0 kubenswrapper[16352]: I0307 21:36:36.834668 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-lnt6b"] Mar 07 21:36:36.837022 master-0 kubenswrapper[16352]: I0307 21:36:36.836956 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.843392 master-0 kubenswrapper[16352]: I0307 21:36:36.843330 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 07 21:36:36.843634 master-0 kubenswrapper[16352]: I0307 21:36:36.843432 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 07 21:36:36.843634 master-0 kubenswrapper[16352]: I0307 21:36:36.843351 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 07 21:36:36.867954 master-0 kubenswrapper[16352]: I0307 21:36:36.867818 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-86ddb6bd46-nx428"] Mar 07 21:36:36.870786 master-0 kubenswrapper[16352]: I0307 21:36:36.870198 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:36.872754 master-0 kubenswrapper[16352]: I0307 21:36:36.872634 16352 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 07 21:36:36.873812 master-0 kubenswrapper[16352]: I0307 21:36:36.873745 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-nx428"] Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880114 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-conf\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880177 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-sockets\") pod 
\"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880287 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880505 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics-certs\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880622 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-conf\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880819 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-startup\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880873 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgklp\" (UniqueName: \"kubernetes.io/projected/88c33979-3534-44e1-a236-519aabbb8682-kube-api-access-fgklp\") pod \"frr-k8s-webhook-server-7f989f654f-vnw67\" (UID: \"88c33979-3534-44e1-a236-519aabbb8682\") " 
pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880908 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-reloader\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880967 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhhh7\" (UniqueName: \"kubernetes.io/projected/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-kube-api-access-mhhh7\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880995 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.881010 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c33979-3534-44e1-a236-519aabbb8682-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vnw67\" (UID: \"88c33979-3534-44e1-a236-519aabbb8682\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.881055 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metrics-certs\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 
21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.881230 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.881312 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metallb-excludel2\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.881402 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fh7pm\" (UniqueName: \"kubernetes.io/projected/869f7c43-071a-4079-b7f4-5d97cf58085e-kube-api-access-fh7pm\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.880887 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-sockets\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.882978 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/869f7c43-071a-4079-b7f4-5d97cf58085e-frr-startup\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.883618 master-0 kubenswrapper[16352]: I0307 21:36:36.883104 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/869f7c43-071a-4079-b7f4-5d97cf58085e-reloader\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.885335 master-0 kubenswrapper[16352]: E0307 21:36:36.885279 16352 secret.go:189] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 07 21:36:36.885409 master-0 kubenswrapper[16352]: E0307 21:36:36.885381 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics-certs podName:869f7c43-071a-4079-b7f4-5d97cf58085e nodeName:}" failed. No retries permitted until 2026-03-07 21:36:37.385353977 +0000 UTC m=+1120.456059036 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics-certs") pod "frr-k8s-9cvbt" (UID: "869f7c43-071a-4079-b7f4-5d97cf58085e") : secret "frr-k8s-certs-secret" not found Mar 07 21:36:36.889223 master-0 kubenswrapper[16352]: I0307 21:36:36.889130 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/88c33979-3534-44e1-a236-519aabbb8682-cert\") pod \"frr-k8s-webhook-server-7f989f654f-vnw67\" (UID: \"88c33979-3534-44e1-a236-519aabbb8682\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:36.903160 master-0 kubenswrapper[16352]: I0307 21:36:36.902980 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgklp\" (UniqueName: \"kubernetes.io/projected/88c33979-3534-44e1-a236-519aabbb8682-kube-api-access-fgklp\") pod \"frr-k8s-webhook-server-7f989f654f-vnw67\" (UID: \"88c33979-3534-44e1-a236-519aabbb8682\") " pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:36.916405 master-0 
kubenswrapper[16352]: I0307 21:36:36.916007 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fh7pm\" (UniqueName: \"kubernetes.io/projected/869f7c43-071a-4079-b7f4-5d97cf58085e-kube-api-access-fh7pm\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:36.984065 master-0 kubenswrapper[16352]: I0307 21:36:36.983982 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccc2f4b9-4661-4eee-8257-f6de6f473f00-cert\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:36.984065 master-0 kubenswrapper[16352]: I0307 21:36:36.984065 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhhh7\" (UniqueName: \"kubernetes.io/projected/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-kube-api-access-mhhh7\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.984343 master-0 kubenswrapper[16352]: I0307 21:36:36.984095 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metrics-certs\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.984343 master-0 kubenswrapper[16352]: I0307 21:36:36.984146 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.984343 master-0 kubenswrapper[16352]: I0307 21:36:36.984167 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sk7p\" (UniqueName: \"kubernetes.io/projected/ccc2f4b9-4661-4eee-8257-f6de6f473f00-kube-api-access-8sk7p\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:36.984343 master-0 kubenswrapper[16352]: I0307 21:36:36.984203 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metallb-excludel2\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.984506 master-0 kubenswrapper[16352]: E0307 21:36:36.984423 16352 secret.go:189] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 07 21:36:36.984743 master-0 kubenswrapper[16352]: E0307 21:36:36.984552 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metrics-certs podName:abc6bb30-8337-4e94-a66b-6dc1a30c3bec nodeName:}" failed. No retries permitted until 2026-03-07 21:36:37.484524421 +0000 UTC m=+1120.555229480 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metrics-certs") pod "speaker-lnt6b" (UID: "abc6bb30-8337-4e94-a66b-6dc1a30c3bec") : secret "speaker-certs-secret" not found Mar 07 21:36:36.984896 master-0 kubenswrapper[16352]: I0307 21:36:36.984867 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccc2f4b9-4661-4eee-8257-f6de6f473f00-metrics-certs\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:36.986326 master-0 kubenswrapper[16352]: I0307 21:36:36.986260 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metallb-excludel2\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:36.986422 master-0 kubenswrapper[16352]: E0307 21:36:36.986383 16352 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 07 21:36:36.986422 master-0 kubenswrapper[16352]: E0307 21:36:36.986416 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist podName:abc6bb30-8337-4e94-a66b-6dc1a30c3bec nodeName:}" failed. No retries permitted until 2026-03-07 21:36:37.486406857 +0000 UTC m=+1120.557111916 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist") pod "speaker-lnt6b" (UID: "abc6bb30-8337-4e94-a66b-6dc1a30c3bec") : secret "metallb-memberlist" not found Mar 07 21:36:37.005121 master-0 kubenswrapper[16352]: I0307 21:36:37.002874 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhhh7\" (UniqueName: \"kubernetes.io/projected/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-kube-api-access-mhhh7\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:37.087713 master-0 kubenswrapper[16352]: I0307 21:36:37.087608 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sk7p\" (UniqueName: \"kubernetes.io/projected/ccc2f4b9-4661-4eee-8257-f6de6f473f00-kube-api-access-8sk7p\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:37.088036 master-0 kubenswrapper[16352]: I0307 21:36:37.087959 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccc2f4b9-4661-4eee-8257-f6de6f473f00-metrics-certs\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:37.088097 master-0 kubenswrapper[16352]: I0307 21:36:37.088033 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccc2f4b9-4661-4eee-8257-f6de6f473f00-cert\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:37.090489 master-0 kubenswrapper[16352]: I0307 21:36:37.089796 16352 reflector.go:368] Caches populated for *v1.Secret from 
object-"metallb-system"/"metallb-webhook-cert" Mar 07 21:36:37.091197 master-0 kubenswrapper[16352]: I0307 21:36:37.091143 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ccc2f4b9-4661-4eee-8257-f6de6f473f00-metrics-certs\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:37.096984 master-0 kubenswrapper[16352]: I0307 21:36:37.096945 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:37.109821 master-0 kubenswrapper[16352]: I0307 21:36:37.109761 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/ccc2f4b9-4661-4eee-8257-f6de6f473f00-cert\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:37.112853 master-0 kubenswrapper[16352]: I0307 21:36:37.112040 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sk7p\" (UniqueName: \"kubernetes.io/projected/ccc2f4b9-4661-4eee-8257-f6de6f473f00-kube-api-access-8sk7p\") pod \"controller-86ddb6bd46-nx428\" (UID: \"ccc2f4b9-4661-4eee-8257-f6de6f473f00\") " pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:37.275739 master-0 kubenswrapper[16352]: I0307 21:36:37.275572 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:37.405517 master-0 kubenswrapper[16352]: I0307 21:36:37.403342 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics-certs\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:37.416573 master-0 kubenswrapper[16352]: I0307 21:36:37.416472 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/869f7c43-071a-4079-b7f4-5d97cf58085e-metrics-certs\") pod \"frr-k8s-9cvbt\" (UID: \"869f7c43-071a-4079-b7f4-5d97cf58085e\") " pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:37.506214 master-0 kubenswrapper[16352]: I0307 21:36:37.506137 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metrics-certs\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:37.506436 master-0 kubenswrapper[16352]: I0307 21:36:37.506289 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:37.506643 master-0 kubenswrapper[16352]: E0307 21:36:37.506583 16352 secret.go:189] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 07 21:36:37.506755 master-0 kubenswrapper[16352]: E0307 21:36:37.506738 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist podName:abc6bb30-8337-4e94-a66b-6dc1a30c3bec 
nodeName:}" failed. No retries permitted until 2026-03-07 21:36:38.506710104 +0000 UTC m=+1121.577415183 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist") pod "speaker-lnt6b" (UID: "abc6bb30-8337-4e94-a66b-6dc1a30c3bec") : secret "metallb-memberlist" not found Mar 07 21:36:37.509938 master-0 kubenswrapper[16352]: I0307 21:36:37.509863 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-metrics-certs\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:37.605836 master-0 kubenswrapper[16352]: I0307 21:36:37.605737 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67"] Mar 07 21:36:37.684658 master-0 kubenswrapper[16352]: I0307 21:36:37.684392 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:37.762370 master-0 kubenswrapper[16352]: I0307 21:36:37.762283 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-86ddb6bd46-nx428"] Mar 07 21:36:38.530491 master-0 kubenswrapper[16352]: I0307 21:36:38.530391 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:38.551356 master-0 kubenswrapper[16352]: I0307 21:36:38.551288 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/abc6bb30-8337-4e94-a66b-6dc1a30c3bec-memberlist\") pod \"speaker-lnt6b\" (UID: \"abc6bb30-8337-4e94-a66b-6dc1a30c3bec\") " pod="metallb-system/speaker-lnt6b" Mar 07 21:36:38.568617 master-0 kubenswrapper[16352]: I0307 21:36:38.568549 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-nx428" event={"ID":"ccc2f4b9-4661-4eee-8257-f6de6f473f00","Type":"ContainerStarted","Data":"f472830511fd1e12d6dd64e097b7324d67dc8db491fe507bd96aa1ff4f8da877"} Mar 07 21:36:38.568617 master-0 kubenswrapper[16352]: I0307 21:36:38.568616 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-nx428" event={"ID":"ccc2f4b9-4661-4eee-8257-f6de6f473f00","Type":"ContainerStarted","Data":"4947b567185c9eadcf37cf1d46de378c95e896de8a4ec618275b6cf8c6402e99"} Mar 07 21:36:38.570610 master-0 kubenswrapper[16352]: I0307 21:36:38.570560 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" event={"ID":"88c33979-3534-44e1-a236-519aabbb8682","Type":"ContainerStarted","Data":"2f472270d9840d01057d9054934d6deeeae8e27ea6ac1994fa6b6ef208879b5b"} Mar 07 
21:36:38.580533 master-0 kubenswrapper[16352]: I0307 21:36:38.580488 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerStarted","Data":"33ffb40f53469d74ff3409097146c429c0b05f0db07f2043517f4a674ed2cf3e"} Mar 07 21:36:38.719163 master-0 kubenswrapper[16352]: I0307 21:36:38.719081 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-26sjk"] Mar 07 21:36:38.720835 master-0 kubenswrapper[16352]: I0307 21:36:38.720803 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" Mar 07 21:36:38.739890 master-0 kubenswrapper[16352]: I0307 21:36:38.739597 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs"] Mar 07 21:36:38.742445 master-0 kubenswrapper[16352]: I0307 21:36:38.740953 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:38.742965 master-0 kubenswrapper[16352]: I0307 21:36:38.742901 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 07 21:36:38.754288 master-0 kubenswrapper[16352]: I0307 21:36:38.753789 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-26sjk"] Mar 07 21:36:38.764189 master-0 kubenswrapper[16352]: I0307 21:36:38.764119 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-9lvhn"] Mar 07 21:36:38.765726 master-0 kubenswrapper[16352]: I0307 21:36:38.765663 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.765801 master-0 kubenswrapper[16352]: I0307 21:36:38.765669 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-lnt6b" Mar 07 21:36:38.773630 master-0 kubenswrapper[16352]: I0307 21:36:38.773567 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs"] Mar 07 21:36:38.845994 master-0 kubenswrapper[16352]: I0307 21:36:38.836821 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aafd47a0-9f29-4048-affa-55d19cb13b7b-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-lsgfs\" (UID: \"aafd47a0-9f29-4048-affa-55d19cb13b7b\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:38.845994 master-0 kubenswrapper[16352]: I0307 21:36:38.836873 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-dbus-socket\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.845994 master-0 kubenswrapper[16352]: I0307 21:36:38.836909 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-nmstate-lock\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.845994 master-0 kubenswrapper[16352]: I0307 21:36:38.836972 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpf99\" (UniqueName: \"kubernetes.io/projected/aafd47a0-9f29-4048-affa-55d19cb13b7b-kube-api-access-wpf99\") pod \"nmstate-webhook-786f45cff4-lsgfs\" (UID: \"aafd47a0-9f29-4048-affa-55d19cb13b7b\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:38.845994 master-0 
kubenswrapper[16352]: I0307 21:36:38.837004 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-ovs-socket\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.845994 master-0 kubenswrapper[16352]: I0307 21:36:38.837140 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gql2s\" (UniqueName: \"kubernetes.io/projected/7e58bc66-1200-424b-b5a2-d2de58a5a731-kube-api-access-gql2s\") pod \"nmstate-metrics-69594cc75-26sjk\" (UID: \"7e58bc66-1200-424b-b5a2-d2de58a5a731\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" Mar 07 21:36:38.845994 master-0 kubenswrapper[16352]: I0307 21:36:38.837192 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jkk5g\" (UniqueName: \"kubernetes.io/projected/a2a4ecb5-0643-497e-a485-1d73c93739de-kube-api-access-jkk5g\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.863935 master-0 kubenswrapper[16352]: W0307 21:36:38.863798 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podabc6bb30_8337_4e94_a66b_6dc1a30c3bec.slice/crio-5ac67430d88de823e082470538cb573b24faa796661789452ae187acea4eec3b WatchSource:0}: Error finding container 5ac67430d88de823e082470538cb573b24faa796661789452ae187acea4eec3b: Status 404 returned error can't find the container with id 5ac67430d88de823e082470538cb573b24faa796661789452ae187acea4eec3b Mar 07 21:36:38.922112 master-0 kubenswrapper[16352]: I0307 21:36:38.922034 16352 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5"] Mar 07 21:36:38.927314 master-0 kubenswrapper[16352]: I0307 21:36:38.927256 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:38.940513 master-0 kubenswrapper[16352]: I0307 21:36:38.939965 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 07 21:36:38.940513 master-0 kubenswrapper[16352]: I0307 21:36:38.940097 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-ovs-socket\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.940513 master-0 kubenswrapper[16352]: I0307 21:36:38.940191 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gql2s\" (UniqueName: \"kubernetes.io/projected/7e58bc66-1200-424b-b5a2-d2de58a5a731-kube-api-access-gql2s\") pod \"nmstate-metrics-69594cc75-26sjk\" (UID: \"7e58bc66-1200-424b-b5a2-d2de58a5a731\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" Mar 07 21:36:38.940513 master-0 kubenswrapper[16352]: I0307 21:36:38.940229 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jkk5g\" (UniqueName: \"kubernetes.io/projected/a2a4ecb5-0643-497e-a485-1d73c93739de-kube-api-access-jkk5g\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.940513 master-0 kubenswrapper[16352]: I0307 21:36:38.940327 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aafd47a0-9f29-4048-affa-55d19cb13b7b-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-lsgfs\" 
(UID: \"aafd47a0-9f29-4048-affa-55d19cb13b7b\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:38.940513 master-0 kubenswrapper[16352]: I0307 21:36:38.940356 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-dbus-socket\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.940513 master-0 kubenswrapper[16352]: I0307 21:36:38.940469 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-dbus-socket\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.941315 master-0 kubenswrapper[16352]: I0307 21:36:38.941012 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-ovs-socket\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.941315 master-0 kubenswrapper[16352]: I0307 21:36:38.940335 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 07 21:36:38.941315 master-0 kubenswrapper[16352]: I0307 21:36:38.941095 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-nmstate-lock\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.941315 master-0 kubenswrapper[16352]: I0307 21:36:38.941153 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-wpf99\" (UniqueName: \"kubernetes.io/projected/aafd47a0-9f29-4048-affa-55d19cb13b7b-kube-api-access-wpf99\") pod \"nmstate-webhook-786f45cff4-lsgfs\" (UID: \"aafd47a0-9f29-4048-affa-55d19cb13b7b\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:38.941565 master-0 kubenswrapper[16352]: I0307 21:36:38.941403 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/a2a4ecb5-0643-497e-a485-1d73c93739de-nmstate-lock\") pod \"nmstate-handler-9lvhn\" (UID: \"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.948534 master-0 kubenswrapper[16352]: I0307 21:36:38.948492 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5"] Mar 07 21:36:38.949743 master-0 kubenswrapper[16352]: I0307 21:36:38.949700 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/aafd47a0-9f29-4048-affa-55d19cb13b7b-tls-key-pair\") pod \"nmstate-webhook-786f45cff4-lsgfs\" (UID: \"aafd47a0-9f29-4048-affa-55d19cb13b7b\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:38.957937 master-0 kubenswrapper[16352]: I0307 21:36:38.957876 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gql2s\" (UniqueName: \"kubernetes.io/projected/7e58bc66-1200-424b-b5a2-d2de58a5a731-kube-api-access-gql2s\") pod \"nmstate-metrics-69594cc75-26sjk\" (UID: \"7e58bc66-1200-424b-b5a2-d2de58a5a731\") " pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" Mar 07 21:36:38.963280 master-0 kubenswrapper[16352]: I0307 21:36:38.963238 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jkk5g\" (UniqueName: \"kubernetes.io/projected/a2a4ecb5-0643-497e-a485-1d73c93739de-kube-api-access-jkk5g\") pod \"nmstate-handler-9lvhn\" (UID: 
\"a2a4ecb5-0643-497e-a485-1d73c93739de\") " pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:38.963890 master-0 kubenswrapper[16352]: I0307 21:36:38.963857 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpf99\" (UniqueName: \"kubernetes.io/projected/aafd47a0-9f29-4048-affa-55d19cb13b7b-kube-api-access-wpf99\") pod \"nmstate-webhook-786f45cff4-lsgfs\" (UID: \"aafd47a0-9f29-4048-affa-55d19cb13b7b\") " pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:39.044372 master-0 kubenswrapper[16352]: I0307 21:36:39.042858 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12b271cd-3d09-448c-8657-6e12b422eb83-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.044372 master-0 kubenswrapper[16352]: I0307 21:36:39.043040 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ldg5\" (UniqueName: \"kubernetes.io/projected/12b271cd-3d09-448c-8657-6e12b422eb83-kube-api-access-8ldg5\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.044372 master-0 kubenswrapper[16352]: I0307 21:36:39.043137 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b271cd-3d09-448c-8657-6e12b422eb83-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.053172 master-0 kubenswrapper[16352]: I0307 21:36:39.040823 16352 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" Mar 07 21:36:39.069963 master-0 kubenswrapper[16352]: I0307 21:36:39.069863 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:39.097496 master-0 kubenswrapper[16352]: I0307 21:36:39.097378 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:39.125091 master-0 kubenswrapper[16352]: I0307 21:36:39.121522 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-5c96487ddf-5r2nd"] Mar 07 21:36:39.125091 master-0 kubenswrapper[16352]: I0307 21:36:39.123492 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.137514 master-0 kubenswrapper[16352]: I0307 21:36:39.137469 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c96487ddf-5r2nd"] Mar 07 21:36:39.163399 master-0 kubenswrapper[16352]: I0307 21:36:39.163298 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b271cd-3d09-448c-8657-6e12b422eb83-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.163608 master-0 kubenswrapper[16352]: I0307 21:36:39.163551 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12b271cd-3d09-448c-8657-6e12b422eb83-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.163698 master-0 kubenswrapper[16352]: I0307 
21:36:39.163667 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ldg5\" (UniqueName: \"kubernetes.io/projected/12b271cd-3d09-448c-8657-6e12b422eb83-kube-api-access-8ldg5\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.165340 master-0 kubenswrapper[16352]: I0307 21:36:39.165312 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/12b271cd-3d09-448c-8657-6e12b422eb83-nginx-conf\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.175767 master-0 kubenswrapper[16352]: I0307 21:36:39.172806 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/12b271cd-3d09-448c-8657-6e12b422eb83-plugin-serving-cert\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.184440 master-0 kubenswrapper[16352]: I0307 21:36:39.184393 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ldg5\" (UniqueName: \"kubernetes.io/projected/12b271cd-3d09-448c-8657-6e12b422eb83-kube-api-access-8ldg5\") pod \"nmstate-console-plugin-5dcbbd79cf-cbbp5\" (UID: \"12b271cd-3d09-448c-8657-6e12b422eb83\") " pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.269053 master-0 kubenswrapper[16352]: I0307 21:36:39.265539 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w6jmc\" (UniqueName: \"kubernetes.io/projected/1ca959b9-f496-4b6b-aa69-762f3bdc3575-kube-api-access-w6jmc\") pod 
\"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.269053 master-0 kubenswrapper[16352]: I0307 21:36:39.265624 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-oauth-config\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.269053 master-0 kubenswrapper[16352]: I0307 21:36:39.265689 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-serving-cert\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.269053 master-0 kubenswrapper[16352]: I0307 21:36:39.265760 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-oauth-serving-cert\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.269053 master-0 kubenswrapper[16352]: I0307 21:36:39.265789 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-service-ca\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.269053 master-0 kubenswrapper[16352]: I0307 21:36:39.266280 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-config\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.269053 master-0 kubenswrapper[16352]: I0307 21:36:39.266665 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-trusted-ca-bundle\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.321381 master-0 kubenswrapper[16352]: I0307 21:36:39.321317 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" Mar 07 21:36:39.370143 master-0 kubenswrapper[16352]: I0307 21:36:39.370048 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-oauth-serving-cert\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.370143 master-0 kubenswrapper[16352]: I0307 21:36:39.370139 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-service-ca\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.370143 master-0 kubenswrapper[16352]: I0307 21:36:39.370159 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-config\") pod 
\"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.370394 master-0 kubenswrapper[16352]: I0307 21:36:39.370243 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-trusted-ca-bundle\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.370394 master-0 kubenswrapper[16352]: I0307 21:36:39.370310 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w6jmc\" (UniqueName: \"kubernetes.io/projected/1ca959b9-f496-4b6b-aa69-762f3bdc3575-kube-api-access-w6jmc\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.370394 master-0 kubenswrapper[16352]: I0307 21:36:39.370335 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-oauth-config\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.370394 master-0 kubenswrapper[16352]: I0307 21:36:39.370359 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-serving-cert\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.372008 master-0 kubenswrapper[16352]: I0307 21:36:39.371968 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-config\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.372569 master-0 kubenswrapper[16352]: I0307 21:36:39.372531 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-oauth-serving-cert\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.373099 master-0 kubenswrapper[16352]: I0307 21:36:39.373074 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-service-ca\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.375712 master-0 kubenswrapper[16352]: I0307 21:36:39.375609 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1ca959b9-f496-4b6b-aa69-762f3bdc3575-trusted-ca-bundle\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.376654 master-0 kubenswrapper[16352]: I0307 21:36:39.376614 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-serving-cert\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.385607 master-0 kubenswrapper[16352]: I0307 21:36:39.385576 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1ca959b9-f496-4b6b-aa69-762f3bdc3575-console-oauth-config\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.450300 master-0 kubenswrapper[16352]: I0307 21:36:39.449401 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w6jmc\" (UniqueName: \"kubernetes.io/projected/1ca959b9-f496-4b6b-aa69-762f3bdc3575-kube-api-access-w6jmc\") pod \"console-5c96487ddf-5r2nd\" (UID: \"1ca959b9-f496-4b6b-aa69-762f3bdc3575\") " pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.492254 master-0 kubenswrapper[16352]: I0307 21:36:39.491785 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:39.604371 master-0 kubenswrapper[16352]: I0307 21:36:39.601012 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9lvhn" event={"ID":"a2a4ecb5-0643-497e-a485-1d73c93739de","Type":"ContainerStarted","Data":"55d59e35816b408f976f5ba04d5999decacc8a11eed5402ac1f7683e6d02cc44"} Mar 07 21:36:39.606333 master-0 kubenswrapper[16352]: I0307 21:36:39.605014 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lnt6b" event={"ID":"abc6bb30-8337-4e94-a66b-6dc1a30c3bec","Type":"ContainerStarted","Data":"c31da8d07aaaa65943a5a2b9700bdb2d5908796c7dc82fb1d25ec89782797c58"} Mar 07 21:36:39.606333 master-0 kubenswrapper[16352]: I0307 21:36:39.605082 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lnt6b" event={"ID":"abc6bb30-8337-4e94-a66b-6dc1a30c3bec","Type":"ContainerStarted","Data":"5ac67430d88de823e082470538cb573b24faa796661789452ae187acea4eec3b"} Mar 07 21:36:39.642586 master-0 kubenswrapper[16352]: I0307 21:36:39.642541 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs"] Mar 07 
21:36:39.775392 master-0 kubenswrapper[16352]: I0307 21:36:39.775175 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-69594cc75-26sjk"] Mar 07 21:36:39.780420 master-0 kubenswrapper[16352]: W0307 21:36:39.780303 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7e58bc66_1200_424b_b5a2_d2de58a5a731.slice/crio-6033b62cdb29bad9c0cfe8e1f321c3acdca49c866d596c9ea4874cffaac3de30 WatchSource:0}: Error finding container 6033b62cdb29bad9c0cfe8e1f321c3acdca49c866d596c9ea4874cffaac3de30: Status 404 returned error can't find the container with id 6033b62cdb29bad9c0cfe8e1f321c3acdca49c866d596c9ea4874cffaac3de30 Mar 07 21:36:39.928457 master-0 kubenswrapper[16352]: I0307 21:36:39.928376 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5"] Mar 07 21:36:39.930382 master-0 kubenswrapper[16352]: W0307 21:36:39.930328 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12b271cd_3d09_448c_8657_6e12b422eb83.slice/crio-f6218decf7d6bb812c7556595d837dadb53693e171b2348c217e668a755b19c4 WatchSource:0}: Error finding container f6218decf7d6bb812c7556595d837dadb53693e171b2348c217e668a755b19c4: Status 404 returned error can't find the container with id f6218decf7d6bb812c7556595d837dadb53693e171b2348c217e668a755b19c4 Mar 07 21:36:40.022212 master-0 kubenswrapper[16352]: I0307 21:36:40.022140 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-5c96487ddf-5r2nd"] Mar 07 21:36:40.023065 master-0 kubenswrapper[16352]: W0307 21:36:40.022978 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1ca959b9_f496_4b6b_aa69_762f3bdc3575.slice/crio-2ba0decbc9a257634ef8c5049d787787360e6ec62d9a7cb08c107aa711806d46 
WatchSource:0}: Error finding container 2ba0decbc9a257634ef8c5049d787787360e6ec62d9a7cb08c107aa711806d46: Status 404 returned error can't find the container with id 2ba0decbc9a257634ef8c5049d787787360e6ec62d9a7cb08c107aa711806d46 Mar 07 21:36:40.620746 master-0 kubenswrapper[16352]: I0307 21:36:40.620645 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-86ddb6bd46-nx428" event={"ID":"ccc2f4b9-4661-4eee-8257-f6de6f473f00","Type":"ContainerStarted","Data":"cb01cbe05127b5266ae8d8b37adb28f3898f5b1752f92460ea4d49b78858065f"} Mar 07 21:36:40.621099 master-0 kubenswrapper[16352]: I0307 21:36:40.620774 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:40.624267 master-0 kubenswrapper[16352]: I0307 21:36:40.624211 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c96487ddf-5r2nd" event={"ID":"1ca959b9-f496-4b6b-aa69-762f3bdc3575","Type":"ContainerStarted","Data":"ca8edac251c30ee92aea45f40873f7ce346157604ee9c4ec074452130703805d"} Mar 07 21:36:40.624405 master-0 kubenswrapper[16352]: I0307 21:36:40.624272 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-5c96487ddf-5r2nd" event={"ID":"1ca959b9-f496-4b6b-aa69-762f3bdc3575","Type":"ContainerStarted","Data":"2ba0decbc9a257634ef8c5049d787787360e6ec62d9a7cb08c107aa711806d46"} Mar 07 21:36:40.628079 master-0 kubenswrapper[16352]: I0307 21:36:40.628024 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-lnt6b" event={"ID":"abc6bb30-8337-4e94-a66b-6dc1a30c3bec","Type":"ContainerStarted","Data":"3ca9373223c49aabf90b2e387091f33615b6f2eb4b2edcf8f83e2c9d6c9f35b8"} Mar 07 21:36:40.628179 master-0 kubenswrapper[16352]: I0307 21:36:40.628123 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-lnt6b" Mar 07 21:36:40.629732 master-0 kubenswrapper[16352]: I0307 21:36:40.629700 
16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" event={"ID":"7e58bc66-1200-424b-b5a2-d2de58a5a731","Type":"ContainerStarted","Data":"6033b62cdb29bad9c0cfe8e1f321c3acdca49c866d596c9ea4874cffaac3de30"} Mar 07 21:36:40.631377 master-0 kubenswrapper[16352]: I0307 21:36:40.631343 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" event={"ID":"12b271cd-3d09-448c-8657-6e12b422eb83","Type":"ContainerStarted","Data":"f6218decf7d6bb812c7556595d837dadb53693e171b2348c217e668a755b19c4"} Mar 07 21:36:40.632705 master-0 kubenswrapper[16352]: I0307 21:36:40.632628 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" event={"ID":"aafd47a0-9f29-4048-affa-55d19cb13b7b","Type":"ContainerStarted","Data":"b63314a42eac84608c0002135c96628c45621294d3d7de78c885dc41734ca51e"} Mar 07 21:36:40.652417 master-0 kubenswrapper[16352]: I0307 21:36:40.652326 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-86ddb6bd46-nx428" podStartSLOduration=2.9144132750000002 podStartE2EDuration="4.652306049s" podCreationTimestamp="2026-03-07 21:36:36 +0000 UTC" firstStartedPulling="2026-03-07 21:36:37.942267192 +0000 UTC m=+1121.012972281" lastFinishedPulling="2026-03-07 21:36:39.680159996 +0000 UTC m=+1122.750865055" observedRunningTime="2026-03-07 21:36:40.645937338 +0000 UTC m=+1123.716642397" watchObservedRunningTime="2026-03-07 21:36:40.652306049 +0000 UTC m=+1123.723011108" Mar 07 21:36:40.671185 master-0 kubenswrapper[16352]: I0307 21:36:40.671106 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-5c96487ddf-5r2nd" podStartSLOduration=1.671089337 podStartE2EDuration="1.671089337s" podCreationTimestamp="2026-03-07 21:36:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:36:40.665695249 +0000 UTC m=+1123.736400308" watchObservedRunningTime="2026-03-07 21:36:40.671089337 +0000 UTC m=+1123.741794396" Mar 07 21:36:40.691865 master-0 kubenswrapper[16352]: I0307 21:36:40.691770 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-lnt6b" podStartSLOduration=3.662962485 podStartE2EDuration="4.69174567s" podCreationTimestamp="2026-03-07 21:36:36 +0000 UTC" firstStartedPulling="2026-03-07 21:36:39.252604139 +0000 UTC m=+1122.323309198" lastFinishedPulling="2026-03-07 21:36:40.281387314 +0000 UTC m=+1123.352092383" observedRunningTime="2026-03-07 21:36:40.68881113 +0000 UTC m=+1123.759516189" watchObservedRunningTime="2026-03-07 21:36:40.69174567 +0000 UTC m=+1123.762450729" Mar 07 21:36:46.727551 master-0 kubenswrapper[16352]: I0307 21:36:46.727431 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" event={"ID":"12b271cd-3d09-448c-8657-6e12b422eb83","Type":"ContainerStarted","Data":"769a32d0b5dd2ab450323994a96c9b75ed32cd9352f444b40a8df5897747356f"} Mar 07 21:36:46.729958 master-0 kubenswrapper[16352]: I0307 21:36:46.729822 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" event={"ID":"aafd47a0-9f29-4048-affa-55d19cb13b7b","Type":"ContainerStarted","Data":"14f75619f858f97636b6aec2be4c9414dc0aae34741ee801cb150c58f12baf51"} Mar 07 21:36:46.731644 master-0 kubenswrapper[16352]: I0307 21:36:46.731603 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:36:46.739165 master-0 kubenswrapper[16352]: I0307 21:36:46.738180 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-9lvhn" 
event={"ID":"a2a4ecb5-0643-497e-a485-1d73c93739de","Type":"ContainerStarted","Data":"94b17d26967ffc1136283cbf5df0a643d44e580d37d6da72e7d4ac1ed9e0814c"} Mar 07 21:36:46.739165 master-0 kubenswrapper[16352]: I0307 21:36:46.739121 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:46.747421 master-0 kubenswrapper[16352]: I0307 21:36:46.747325 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" event={"ID":"88c33979-3534-44e1-a236-519aabbb8682","Type":"ContainerStarted","Data":"8659b1c66c91ccffb2de0040871b74f3508d1047aaafae31c113d5e244334d78"} Mar 07 21:36:46.747930 master-0 kubenswrapper[16352]: I0307 21:36:46.747900 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:46.760032 master-0 kubenswrapper[16352]: I0307 21:36:46.759866 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-5dcbbd79cf-cbbp5" podStartSLOduration=3.245840374 podStartE2EDuration="8.759821929s" podCreationTimestamp="2026-03-07 21:36:38 +0000 UTC" firstStartedPulling="2026-03-07 21:36:39.933255382 +0000 UTC m=+1123.003960441" lastFinishedPulling="2026-03-07 21:36:45.447236927 +0000 UTC m=+1128.517941996" observedRunningTime="2026-03-07 21:36:46.756313406 +0000 UTC m=+1129.827018465" watchObservedRunningTime="2026-03-07 21:36:46.759821929 +0000 UTC m=+1129.830527018" Mar 07 21:36:46.765888 master-0 kubenswrapper[16352]: I0307 21:36:46.764751 16352 generic.go:334] "Generic (PLEG): container finished" podID="869f7c43-071a-4079-b7f4-5d97cf58085e" containerID="715c8f6f55e2e960aef29bf1e931377305b75d775baf0d77a9e3c24b8df5b7af" exitCode=0 Mar 07 21:36:46.765888 master-0 kubenswrapper[16352]: I0307 21:36:46.764851 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" 
event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerDied","Data":"715c8f6f55e2e960aef29bf1e931377305b75d775baf0d77a9e3c24b8df5b7af"} Mar 07 21:36:46.781530 master-0 kubenswrapper[16352]: I0307 21:36:46.781482 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" event={"ID":"7e58bc66-1200-424b-b5a2-d2de58a5a731","Type":"ContainerStarted","Data":"5eb1b62a32f16a41654a6a379c34c803a768e59146bbae45d9cc9d740ad46e0a"} Mar 07 21:36:46.781735 master-0 kubenswrapper[16352]: I0307 21:36:46.781675 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" event={"ID":"7e58bc66-1200-424b-b5a2-d2de58a5a731","Type":"ContainerStarted","Data":"fad0806d5c638f8f0c4e59c52bef765ef25847eb2c9557c7189860d6168ef9bd"} Mar 07 21:36:46.796283 master-0 kubenswrapper[16352]: I0307 21:36:46.796154 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-9lvhn" podStartSLOduration=2.54580287 podStartE2EDuration="8.796122205s" podCreationTimestamp="2026-03-07 21:36:38 +0000 UTC" firstStartedPulling="2026-03-07 21:36:39.195451007 +0000 UTC m=+1122.266156066" lastFinishedPulling="2026-03-07 21:36:45.445770302 +0000 UTC m=+1128.516475401" observedRunningTime="2026-03-07 21:36:46.791068844 +0000 UTC m=+1129.861773903" watchObservedRunningTime="2026-03-07 21:36:46.796122205 +0000 UTC m=+1129.866827274" Mar 07 21:36:46.838891 master-0 kubenswrapper[16352]: I0307 21:36:46.838775 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" podStartSLOduration=2.9896325 podStartE2EDuration="10.838744252s" podCreationTimestamp="2026-03-07 21:36:36 +0000 UTC" firstStartedPulling="2026-03-07 21:36:37.601408663 +0000 UTC m=+1120.672113722" lastFinishedPulling="2026-03-07 21:36:45.450520405 +0000 UTC m=+1128.521225474" observedRunningTime="2026-03-07 
21:36:46.824602324 +0000 UTC m=+1129.895307393" watchObservedRunningTime="2026-03-07 21:36:46.838744252 +0000 UTC m=+1129.909449331" Mar 07 21:36:46.849983 master-0 kubenswrapper[16352]: I0307 21:36:46.849877 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" podStartSLOduration=3.071458465 podStartE2EDuration="8.849846786s" podCreationTimestamp="2026-03-07 21:36:38 +0000 UTC" firstStartedPulling="2026-03-07 21:36:39.673870006 +0000 UTC m=+1122.744575065" lastFinishedPulling="2026-03-07 21:36:45.452258297 +0000 UTC m=+1128.522963386" observedRunningTime="2026-03-07 21:36:46.847228254 +0000 UTC m=+1129.917933323" watchObservedRunningTime="2026-03-07 21:36:46.849846786 +0000 UTC m=+1129.920551855" Mar 07 21:36:46.910126 master-0 kubenswrapper[16352]: I0307 21:36:46.907723 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-69594cc75-26sjk" podStartSLOduration=3.242059064 podStartE2EDuration="8.907665375s" podCreationTimestamp="2026-03-07 21:36:38 +0000 UTC" firstStartedPulling="2026-03-07 21:36:39.786937032 +0000 UTC m=+1122.857642101" lastFinishedPulling="2026-03-07 21:36:45.452543323 +0000 UTC m=+1128.523248412" observedRunningTime="2026-03-07 21:36:46.879396691 +0000 UTC m=+1129.950101750" watchObservedRunningTime="2026-03-07 21:36:46.907665375 +0000 UTC m=+1129.978370454" Mar 07 21:36:47.280300 master-0 kubenswrapper[16352]: I0307 21:36:47.280129 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-86ddb6bd46-nx428" Mar 07 21:36:47.798801 master-0 kubenswrapper[16352]: I0307 21:36:47.798661 16352 generic.go:334] "Generic (PLEG): container finished" podID="869f7c43-071a-4079-b7f4-5d97cf58085e" containerID="5bd8c85df58dd70bac3398a12a8b7f95f42fd401a50cd4d7d5112825f47c5209" exitCode=0 Mar 07 21:36:47.798801 master-0 kubenswrapper[16352]: I0307 21:36:47.798735 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerDied","Data":"5bd8c85df58dd70bac3398a12a8b7f95f42fd401a50cd4d7d5112825f47c5209"} Mar 07 21:36:48.816544 master-0 kubenswrapper[16352]: I0307 21:36:48.816447 16352 generic.go:334] "Generic (PLEG): container finished" podID="869f7c43-071a-4079-b7f4-5d97cf58085e" containerID="6e5489ebd90a6b70d18cb53a24b570540aca55a7518c5f22e5d4c57044932440" exitCode=0 Mar 07 21:36:48.817247 master-0 kubenswrapper[16352]: I0307 21:36:48.816939 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerDied","Data":"6e5489ebd90a6b70d18cb53a24b570540aca55a7518c5f22e5d4c57044932440"} Mar 07 21:36:49.493021 master-0 kubenswrapper[16352]: I0307 21:36:49.492377 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:49.493021 master-0 kubenswrapper[16352]: I0307 21:36:49.492576 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:49.509743 master-0 kubenswrapper[16352]: I0307 21:36:49.506589 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:49.838196 master-0 kubenswrapper[16352]: I0307 21:36:49.838123 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerStarted","Data":"dd5a8cc23904172583541cb0b1d42e803bb1d84686ce013b809fabaad5cc0fa5"} Mar 07 21:36:49.838196 master-0 kubenswrapper[16352]: I0307 21:36:49.838189 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" 
event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerStarted","Data":"7e2ff75fbd542a349303bfba4921caca532a1c2dd6579a8b8dd9a43f7bd6c258"} Mar 07 21:36:49.838196 master-0 kubenswrapper[16352]: I0307 21:36:49.838205 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerStarted","Data":"d12b9576dff32f691904e077b629ebad225ad3d3e44364bdbfc8bc0caa5e0fa3"} Mar 07 21:36:49.838196 master-0 kubenswrapper[16352]: I0307 21:36:49.838222 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerStarted","Data":"afbf5bdb3f414a79a1a94be92130816d1ba2754ad25dae2569b4b063520685a1"} Mar 07 21:36:49.845848 master-0 kubenswrapper[16352]: I0307 21:36:49.845644 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-5c96487ddf-5r2nd" Mar 07 21:36:49.936810 master-0 kubenswrapper[16352]: I0307 21:36:49.936638 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6594fcb745-7lf8n"] Mar 07 21:36:50.862307 master-0 kubenswrapper[16352]: I0307 21:36:50.862206 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerStarted","Data":"92a31bf51f3941c72ef0863cbe5b9e185b5f5ac8411499caf69d6ae62bcd405f"} Mar 07 21:36:50.862307 master-0 kubenswrapper[16352]: I0307 21:36:50.862292 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-9cvbt" event={"ID":"869f7c43-071a-4079-b7f4-5d97cf58085e","Type":"ContainerStarted","Data":"6da0c73befeb3250fe1b5f7c89843dba4f26b0ce07cfdd6d2b1272cafda34f07"} Mar 07 21:36:50.863239 master-0 kubenswrapper[16352]: I0307 21:36:50.862580 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-9cvbt" Mar 07 
21:36:50.909535 master-0 kubenswrapper[16352]: I0307 21:36:50.909431 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-9cvbt" podStartSLOduration=7.236464464 podStartE2EDuration="14.909406337s" podCreationTimestamp="2026-03-07 21:36:36 +0000 UTC" firstStartedPulling="2026-03-07 21:36:37.852544851 +0000 UTC m=+1120.923249910" lastFinishedPulling="2026-03-07 21:36:45.525486714 +0000 UTC m=+1128.596191783" observedRunningTime="2026-03-07 21:36:50.90237651 +0000 UTC m=+1133.973081579" watchObservedRunningTime="2026-03-07 21:36:50.909406337 +0000 UTC m=+1133.980111396" Mar 07 21:36:52.685790 master-0 kubenswrapper[16352]: I0307 21:36:52.685666 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:52.737227 master-0 kubenswrapper[16352]: I0307 21:36:52.737150 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:36:54.143267 master-0 kubenswrapper[16352]: I0307 21:36:54.143167 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-9lvhn" Mar 07 21:36:57.104281 master-0 kubenswrapper[16352]: I0307 21:36:57.104185 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7f989f654f-vnw67" Mar 07 21:36:58.770107 master-0 kubenswrapper[16352]: I0307 21:36:58.770020 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-lnt6b" Mar 07 21:36:59.079789 master-0 kubenswrapper[16352]: I0307 21:36:59.079543 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-786f45cff4-lsgfs" Mar 07 21:37:04.858494 master-0 kubenswrapper[16352]: I0307 21:37:04.858384 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-storage/vg-manager-9nzbx"] Mar 07 21:37:04.860572 
master-0 kubenswrapper[16352]: I0307 21:37:04.860533 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:04.863547 master-0 kubenswrapper[16352]: I0307 21:37:04.863470 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-storage"/"vg-manager-metrics-cert" Mar 07 21:37:04.874906 master-0 kubenswrapper[16352]: I0307 21:37:04.874818 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-9nzbx"] Mar 07 21:37:05.015509 master-0 kubenswrapper[16352]: I0307 21:37:05.015419 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/060408c0-237a-4176-8ec5-45070b1d8c54-metrics-cert\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.015813 master-0 kubenswrapper[16352]: I0307 21:37:05.015533 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-registration-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.015813 master-0 kubenswrapper[16352]: I0307 21:37:05.015566 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-lvmd-config\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.015813 master-0 kubenswrapper[16352]: I0307 21:37:05.015590 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-udev\" (UniqueName: 
\"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-run-udev\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.015813 master-0 kubenswrapper[16352]: I0307 21:37:05.015616 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-node-plugin-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.015813 master-0 kubenswrapper[16352]: I0307 21:37:05.015658 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-pod-volumes-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.016209 master-0 kubenswrapper[16352]: I0307 21:37:05.016064 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n2wm\" (UniqueName: \"kubernetes.io/projected/060408c0-237a-4176-8ec5-45070b1d8c54-kube-api-access-2n2wm\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.016659 master-0 kubenswrapper[16352]: I0307 21:37:05.016610 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-sys\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.016881 master-0 kubenswrapper[16352]: I0307 21:37:05.016837 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-file-lock-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.017147 master-0 kubenswrapper[16352]: I0307 21:37:05.017067 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-device-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.017359 master-0 kubenswrapper[16352]: I0307 21:37:05.017322 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-csi-plugin-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.119429 master-0 kubenswrapper[16352]: I0307 21:37:05.119239 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-pod-volumes-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.119429 master-0 kubenswrapper[16352]: I0307 21:37:05.119327 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n2wm\" (UniqueName: \"kubernetes.io/projected/060408c0-237a-4176-8ec5-45070b1d8c54-kube-api-access-2n2wm\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.119429 master-0 kubenswrapper[16352]: I0307 21:37:05.119410 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-sys\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.119818 master-0 kubenswrapper[16352]: I0307 21:37:05.119549 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-volumes-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-pod-volumes-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.119818 master-0 kubenswrapper[16352]: I0307 21:37:05.119577 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-sys\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.119818 master-0 kubenswrapper[16352]: I0307 21:37:05.119495 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-file-lock-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.119975 master-0 kubenswrapper[16352]: I0307 21:37:05.119873 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-device-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120072 master-0 kubenswrapper[16352]: I0307 21:37:05.119953 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"file-lock-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-file-lock-dir\") pod \"vg-manager-9nzbx\" (UID: 
\"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120132 master-0 kubenswrapper[16352]: I0307 21:37:05.120077 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-csi-plugin-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120132 master-0 kubenswrapper[16352]: I0307 21:37:05.120027 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-device-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120224 master-0 kubenswrapper[16352]: I0307 21:37:05.120193 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/060408c0-237a-4176-8ec5-45070b1d8c54-metrics-cert\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120420 master-0 kubenswrapper[16352]: I0307 21:37:05.120388 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-csi-plugin-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120479 master-0 kubenswrapper[16352]: I0307 21:37:05.120437 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-registration-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 
21:37:05.120533 master-0 kubenswrapper[16352]: I0307 21:37:05.120508 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-lvmd-config\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120533 master-0 kubenswrapper[16352]: I0307 21:37:05.120523 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-registration-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120617 master-0 kubenswrapper[16352]: I0307 21:37:05.120572 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-run-udev\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120617 master-0 kubenswrapper[16352]: I0307 21:37:05.120604 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-node-plugin-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120764 master-0 kubenswrapper[16352]: I0307 21:37:05.120710 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-udev\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-run-udev\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120917 master-0 kubenswrapper[16352]: I0307 21:37:05.120867 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"lvmd-config\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-lvmd-config\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.120969 master-0 kubenswrapper[16352]: I0307 21:37:05.120887 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-plugin-dir\" (UniqueName: \"kubernetes.io/host-path/060408c0-237a-4176-8ec5-45070b1d8c54-node-plugin-dir\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.124375 master-0 kubenswrapper[16352]: I0307 21:37:05.124322 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/060408c0-237a-4176-8ec5-45070b1d8c54-metrics-cert\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.142693 master-0 kubenswrapper[16352]: I0307 21:37:05.142599 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n2wm\" (UniqueName: \"kubernetes.io/projected/060408c0-237a-4176-8ec5-45070b1d8c54-kube-api-access-2n2wm\") pod \"vg-manager-9nzbx\" (UID: \"060408c0-237a-4176-8ec5-45070b1d8c54\") " pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.219661 master-0 kubenswrapper[16352]: I0307 21:37:05.219604 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:05.723863 master-0 kubenswrapper[16352]: I0307 21:37:05.723754 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-storage/vg-manager-9nzbx"] Mar 07 21:37:05.725752 master-0 kubenswrapper[16352]: W0307 21:37:05.725464 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod060408c0_237a_4176_8ec5_45070b1d8c54.slice/crio-a023f8cc433d4d1e06067eed6bd75720d31d944e0590c7c626e3b392750163a1 WatchSource:0}: Error finding container a023f8cc433d4d1e06067eed6bd75720d31d944e0590c7c626e3b392750163a1: Status 404 returned error can't find the container with id a023f8cc433d4d1e06067eed6bd75720d31d944e0590c7c626e3b392750163a1 Mar 07 21:37:06.066264 master-0 kubenswrapper[16352]: I0307 21:37:06.066042 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9nzbx" event={"ID":"060408c0-237a-4176-8ec5-45070b1d8c54","Type":"ContainerStarted","Data":"8533c11427ccb2f13dead9f6568136d9789f45205c0b23de16a58434fc115a00"} Mar 07 21:37:06.066264 master-0 kubenswrapper[16352]: I0307 21:37:06.066120 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9nzbx" event={"ID":"060408c0-237a-4176-8ec5-45070b1d8c54","Type":"ContainerStarted","Data":"a023f8cc433d4d1e06067eed6bd75720d31d944e0590c7c626e3b392750163a1"} Mar 07 21:37:06.096125 master-0 kubenswrapper[16352]: I0307 21:37:06.096012 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-storage/vg-manager-9nzbx" podStartSLOduration=2.095978391 podStartE2EDuration="2.095978391s" podCreationTimestamp="2026-03-07 21:37:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:37:06.09341718 +0000 UTC m=+1149.164122249" watchObservedRunningTime="2026-03-07 21:37:06.095978391 +0000 
UTC m=+1149.166683490" Mar 07 21:37:07.690594 master-0 kubenswrapper[16352]: I0307 21:37:07.690493 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-9cvbt" Mar 07 21:37:08.097551 master-0 kubenswrapper[16352]: I0307 21:37:08.097434 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-9nzbx_060408c0-237a-4176-8ec5-45070b1d8c54/vg-manager/0.log" Mar 07 21:37:08.097788 master-0 kubenswrapper[16352]: I0307 21:37:08.097604 16352 generic.go:334] "Generic (PLEG): container finished" podID="060408c0-237a-4176-8ec5-45070b1d8c54" containerID="8533c11427ccb2f13dead9f6568136d9789f45205c0b23de16a58434fc115a00" exitCode=1 Mar 07 21:37:08.097788 master-0 kubenswrapper[16352]: I0307 21:37:08.097671 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9nzbx" event={"ID":"060408c0-237a-4176-8ec5-45070b1d8c54","Type":"ContainerDied","Data":"8533c11427ccb2f13dead9f6568136d9789f45205c0b23de16a58434fc115a00"} Mar 07 21:37:08.098902 master-0 kubenswrapper[16352]: I0307 21:37:08.098856 16352 scope.go:117] "RemoveContainer" containerID="8533c11427ccb2f13dead9f6568136d9789f45205c0b23de16a58434fc115a00" Mar 07 21:37:08.523098 master-0 kubenswrapper[16352]: I0307 21:37:08.523027 16352 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock" Mar 07 21:37:09.116227 master-0 kubenswrapper[16352]: I0307 21:37:09.116159 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-storage_vg-manager-9nzbx_060408c0-237a-4176-8ec5-45070b1d8c54/vg-manager/0.log" Mar 07 21:37:09.116888 master-0 kubenswrapper[16352]: I0307 21:37:09.116239 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-storage/vg-manager-9nzbx" 
event={"ID":"060408c0-237a-4176-8ec5-45070b1d8c54","Type":"ContainerStarted","Data":"e9a4e91a16e1673d2087624b8d17a0e8606ad28832814cba80cbca427563dd5c"} Mar 07 21:37:09.298927 master-0 kubenswrapper[16352]: I0307 21:37:09.298490 16352 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/topolvm.io-reg.sock","Timestamp":"2026-03-07T21:37:08.523077402Z","Handler":null,"Name":""} Mar 07 21:37:09.302221 master-0 kubenswrapper[16352]: I0307 21:37:09.302156 16352 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: topolvm.io endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock versions: 1.0.0 Mar 07 21:37:09.302221 master-0 kubenswrapper[16352]: I0307 21:37:09.302219 16352 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: topolvm.io at endpoint: /var/lib/kubelet/plugins/topolvm.io/node/csi-topolvm.sock Mar 07 21:37:14.998258 master-0 kubenswrapper[16352]: I0307 21:37:14.998110 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-6594fcb745-7lf8n" podUID="f2cb093b-e5fc-4408-8fdf-8b72dfc80385" containerName="console" containerID="cri-o://246b22ec02344c71757e8b8be4f003ccb468a1824adbbd2e754cfa9692d708d1" gracePeriod=15 Mar 07 21:37:15.216836 master-0 kubenswrapper[16352]: I0307 21:37:15.216744 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6594fcb745-7lf8n_f2cb093b-e5fc-4408-8fdf-8b72dfc80385/console/0.log" Mar 07 21:37:15.217206 master-0 kubenswrapper[16352]: I0307 21:37:15.216901 16352 generic.go:334] "Generic (PLEG): container finished" podID="f2cb093b-e5fc-4408-8fdf-8b72dfc80385" containerID="246b22ec02344c71757e8b8be4f003ccb468a1824adbbd2e754cfa9692d708d1" exitCode=2 Mar 07 21:37:15.217206 master-0 kubenswrapper[16352]: I0307 21:37:15.216953 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6594fcb745-7lf8n" 
event={"ID":"f2cb093b-e5fc-4408-8fdf-8b72dfc80385","Type":"ContainerDied","Data":"246b22ec02344c71757e8b8be4f003ccb468a1824adbbd2e754cfa9692d708d1"} Mar 07 21:37:15.220756 master-0 kubenswrapper[16352]: I0307 21:37:15.220585 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:15.223901 master-0 kubenswrapper[16352]: I0307 21:37:15.223838 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:15.646051 master-0 kubenswrapper[16352]: I0307 21:37:15.645977 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6594fcb745-7lf8n_f2cb093b-e5fc-4408-8fdf-8b72dfc80385/console/0.log" Mar 07 21:37:15.646602 master-0 kubenswrapper[16352]: I0307 21:37:15.646110 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6594fcb745-7lf8n" Mar 07 21:37:15.707733 master-0 kubenswrapper[16352]: I0307 21:37:15.707616 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-oauth-serving-cert\") pod \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " Mar 07 21:37:15.707733 master-0 kubenswrapper[16352]: I0307 21:37:15.707734 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-oauth-config\") pod \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " Mar 07 21:37:15.708343 master-0 kubenswrapper[16352]: I0307 21:37:15.707770 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mdh29\" (UniqueName: 
\"kubernetes.io/projected/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-kube-api-access-mdh29\") pod \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " Mar 07 21:37:15.708343 master-0 kubenswrapper[16352]: I0307 21:37:15.707883 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-serving-cert\") pod \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " Mar 07 21:37:15.708343 master-0 kubenswrapper[16352]: I0307 21:37:15.707949 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-service-ca\") pod \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " Mar 07 21:37:15.708343 master-0 kubenswrapper[16352]: I0307 21:37:15.708000 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-config\") pod \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " Mar 07 21:37:15.708343 master-0 kubenswrapper[16352]: I0307 21:37:15.708086 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-trusted-ca-bundle\") pod \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\" (UID: \"f2cb093b-e5fc-4408-8fdf-8b72dfc80385\") " Mar 07 21:37:15.709009 master-0 kubenswrapper[16352]: I0307 21:37:15.708664 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f2cb093b-e5fc-4408-8fdf-8b72dfc80385" (UID: 
"f2cb093b-e5fc-4408-8fdf-8b72dfc80385"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:37:15.710743 master-0 kubenswrapper[16352]: I0307 21:37:15.709545 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f2cb093b-e5fc-4408-8fdf-8b72dfc80385" (UID: "f2cb093b-e5fc-4408-8fdf-8b72dfc80385"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:37:15.710743 master-0 kubenswrapper[16352]: I0307 21:37:15.710133 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-service-ca" (OuterVolumeSpecName: "service-ca") pod "f2cb093b-e5fc-4408-8fdf-8b72dfc80385" (UID: "f2cb093b-e5fc-4408-8fdf-8b72dfc80385"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:37:15.710743 master-0 kubenswrapper[16352]: I0307 21:37:15.710415 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-config" (OuterVolumeSpecName: "console-config") pod "f2cb093b-e5fc-4408-8fdf-8b72dfc80385" (UID: "f2cb093b-e5fc-4408-8fdf-8b72dfc80385"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:37:15.715577 master-0 kubenswrapper[16352]: I0307 21:37:15.713186 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f2cb093b-e5fc-4408-8fdf-8b72dfc80385" (UID: "f2cb093b-e5fc-4408-8fdf-8b72dfc80385"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:37:15.715577 master-0 kubenswrapper[16352]: I0307 21:37:15.714652 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f2cb093b-e5fc-4408-8fdf-8b72dfc80385" (UID: "f2cb093b-e5fc-4408-8fdf-8b72dfc80385"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:37:15.717502 master-0 kubenswrapper[16352]: I0307 21:37:15.717071 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-kube-api-access-mdh29" (OuterVolumeSpecName: "kube-api-access-mdh29") pod "f2cb093b-e5fc-4408-8fdf-8b72dfc80385" (UID: "f2cb093b-e5fc-4408-8fdf-8b72dfc80385"). InnerVolumeSpecName "kube-api-access-mdh29". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:37:15.810152 master-0 kubenswrapper[16352]: I0307 21:37:15.809971 16352 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:15.810152 master-0 kubenswrapper[16352]: I0307 21:37:15.810038 16352 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-service-ca\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:15.810152 master-0 kubenswrapper[16352]: I0307 21:37:15.810051 16352 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:15.810152 master-0 kubenswrapper[16352]: I0307 21:37:15.810062 16352 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-trusted-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:15.810152 master-0 kubenswrapper[16352]: I0307 21:37:15.810071 16352 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-oauth-serving-cert\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:15.810152 master-0 kubenswrapper[16352]: I0307 21:37:15.810080 16352 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-console-oauth-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:15.810152 master-0 kubenswrapper[16352]: I0307 21:37:15.810091 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mdh29\" (UniqueName: \"kubernetes.io/projected/f2cb093b-e5fc-4408-8fdf-8b72dfc80385-kube-api-access-mdh29\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:16.237875 master-0 kubenswrapper[16352]: I0307 21:37:16.237140 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6594fcb745-7lf8n_f2cb093b-e5fc-4408-8fdf-8b72dfc80385/console/0.log" Mar 07 21:37:16.238881 master-0 kubenswrapper[16352]: I0307 21:37:16.237994 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6594fcb745-7lf8n" Mar 07 21:37:16.238881 master-0 kubenswrapper[16352]: I0307 21:37:16.238044 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6594fcb745-7lf8n" event={"ID":"f2cb093b-e5fc-4408-8fdf-8b72dfc80385","Type":"ContainerDied","Data":"876fc01ff43d1820c5b0772826f314253e844a17904f05511b33f8c9626b2f29"} Mar 07 21:37:16.238881 master-0 kubenswrapper[16352]: I0307 21:37:16.238176 16352 scope.go:117] "RemoveContainer" containerID="246b22ec02344c71757e8b8be4f003ccb468a1824adbbd2e754cfa9692d708d1" Mar 07 21:37:16.239109 master-0 kubenswrapper[16352]: I0307 21:37:16.239030 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:16.241133 master-0 kubenswrapper[16352]: I0307 21:37:16.241073 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-storage/vg-manager-9nzbx" Mar 07 21:37:16.333131 master-0 kubenswrapper[16352]: I0307 21:37:16.327489 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6594fcb745-7lf8n"] Mar 07 21:37:16.342263 master-0 kubenswrapper[16352]: I0307 21:37:16.342191 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6594fcb745-7lf8n"] Mar 07 21:37:17.216764 master-0 kubenswrapper[16352]: I0307 21:37:17.213899 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2cb093b-e5fc-4408-8fdf-8b72dfc80385" path="/var/lib/kubelet/pods/f2cb093b-e5fc-4408-8fdf-8b72dfc80385/volumes" Mar 07 21:37:18.487709 master-0 kubenswrapper[16352]: I0307 21:37:18.486645 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-zt56c"] Mar 07 21:37:18.487709 master-0 kubenswrapper[16352]: E0307 21:37:18.487184 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2cb093b-e5fc-4408-8fdf-8b72dfc80385" 
containerName="console" Mar 07 21:37:18.487709 master-0 kubenswrapper[16352]: I0307 21:37:18.487202 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2cb093b-e5fc-4408-8fdf-8b72dfc80385" containerName="console" Mar 07 21:37:18.487709 master-0 kubenswrapper[16352]: I0307 21:37:18.487373 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2cb093b-e5fc-4408-8fdf-8b72dfc80385" containerName="console" Mar 07 21:37:18.488531 master-0 kubenswrapper[16352]: I0307 21:37:18.488010 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zt56c" Mar 07 21:37:18.494707 master-0 kubenswrapper[16352]: I0307 21:37:18.493060 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 07 21:37:18.494707 master-0 kubenswrapper[16352]: I0307 21:37:18.493296 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 07 21:37:18.550170 master-0 kubenswrapper[16352]: I0307 21:37:18.540621 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zt56c"] Mar 07 21:37:18.587028 master-0 kubenswrapper[16352]: I0307 21:37:18.579894 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wr4pp\" (UniqueName: \"kubernetes.io/projected/e849276d-9145-4c9b-a423-f45a8db0cb25-kube-api-access-wr4pp\") pod \"openstack-operator-index-zt56c\" (UID: \"e849276d-9145-4c9b-a423-f45a8db0cb25\") " pod="openstack-operators/openstack-operator-index-zt56c" Mar 07 21:37:18.681729 master-0 kubenswrapper[16352]: I0307 21:37:18.681625 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wr4pp\" (UniqueName: \"kubernetes.io/projected/e849276d-9145-4c9b-a423-f45a8db0cb25-kube-api-access-wr4pp\") pod \"openstack-operator-index-zt56c\" (UID: 
\"e849276d-9145-4c9b-a423-f45a8db0cb25\") " pod="openstack-operators/openstack-operator-index-zt56c" Mar 07 21:37:18.702541 master-0 kubenswrapper[16352]: I0307 21:37:18.702490 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wr4pp\" (UniqueName: \"kubernetes.io/projected/e849276d-9145-4c9b-a423-f45a8db0cb25-kube-api-access-wr4pp\") pod \"openstack-operator-index-zt56c\" (UID: \"e849276d-9145-4c9b-a423-f45a8db0cb25\") " pod="openstack-operators/openstack-operator-index-zt56c" Mar 07 21:37:18.848930 master-0 kubenswrapper[16352]: I0307 21:37:18.848846 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zt56c" Mar 07 21:37:19.320416 master-0 kubenswrapper[16352]: W0307 21:37:19.319754 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode849276d_9145_4c9b_a423_f45a8db0cb25.slice/crio-b29385829fb82ea191fb154b95485d35eaa3ec5518ace6acb61ac79f652632b0 WatchSource:0}: Error finding container b29385829fb82ea191fb154b95485d35eaa3ec5518ace6acb61ac79f652632b0: Status 404 returned error can't find the container with id b29385829fb82ea191fb154b95485d35eaa3ec5518ace6acb61ac79f652632b0 Mar 07 21:37:19.332926 master-0 kubenswrapper[16352]: I0307 21:37:19.332818 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-zt56c"] Mar 07 21:37:20.300476 master-0 kubenswrapper[16352]: I0307 21:37:20.300373 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zt56c" event={"ID":"e849276d-9145-4c9b-a423-f45a8db0cb25","Type":"ContainerStarted","Data":"b29385829fb82ea191fb154b95485d35eaa3ec5518ace6acb61ac79f652632b0"} Mar 07 21:37:21.317969 master-0 kubenswrapper[16352]: I0307 21:37:21.317880 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-zt56c" event={"ID":"e849276d-9145-4c9b-a423-f45a8db0cb25","Type":"ContainerStarted","Data":"282c140eb305e13eb602ba5d73fdec54d3b3cf03a2ddb27c8195fddbdda7c5e3"} Mar 07 21:37:21.350239 master-0 kubenswrapper[16352]: I0307 21:37:21.350108 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-zt56c" podStartSLOduration=2.386405383 podStartE2EDuration="3.350076895s" podCreationTimestamp="2026-03-07 21:37:18 +0000 UTC" firstStartedPulling="2026-03-07 21:37:19.323212789 +0000 UTC m=+1162.393917858" lastFinishedPulling="2026-03-07 21:37:20.286884301 +0000 UTC m=+1163.357589370" observedRunningTime="2026-03-07 21:37:21.339851761 +0000 UTC m=+1164.410556880" watchObservedRunningTime="2026-03-07 21:37:21.350076895 +0000 UTC m=+1164.420781974" Mar 07 21:37:22.590932 master-0 kubenswrapper[16352]: I0307 21:37:22.590827 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zt56c"] Mar 07 21:37:23.243849 master-0 kubenswrapper[16352]: I0307 21:37:23.243767 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-klqvq"] Mar 07 21:37:23.245669 master-0 kubenswrapper[16352]: I0307 21:37:23.245614 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:23.257147 master-0 kubenswrapper[16352]: I0307 21:37:23.257069 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-klqvq"] Mar 07 21:37:23.344621 master-0 kubenswrapper[16352]: I0307 21:37:23.344445 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-zt56c" podUID="e849276d-9145-4c9b-a423-f45a8db0cb25" containerName="registry-server" containerID="cri-o://282c140eb305e13eb602ba5d73fdec54d3b3cf03a2ddb27c8195fddbdda7c5e3" gracePeriod=2 Mar 07 21:37:23.395313 master-0 kubenswrapper[16352]: I0307 21:37:23.395195 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kn6rr\" (UniqueName: \"kubernetes.io/projected/6cd7e018-1ab7-4803-921f-f7c17fab5263-kube-api-access-kn6rr\") pod \"openstack-operator-index-klqvq\" (UID: \"6cd7e018-1ab7-4803-921f-f7c17fab5263\") " pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:23.497989 master-0 kubenswrapper[16352]: I0307 21:37:23.497783 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kn6rr\" (UniqueName: \"kubernetes.io/projected/6cd7e018-1ab7-4803-921f-f7c17fab5263-kube-api-access-kn6rr\") pod \"openstack-operator-index-klqvq\" (UID: \"6cd7e018-1ab7-4803-921f-f7c17fab5263\") " pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:23.532605 master-0 kubenswrapper[16352]: I0307 21:37:23.532494 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kn6rr\" (UniqueName: \"kubernetes.io/projected/6cd7e018-1ab7-4803-921f-f7c17fab5263-kube-api-access-kn6rr\") pod \"openstack-operator-index-klqvq\" (UID: \"6cd7e018-1ab7-4803-921f-f7c17fab5263\") " pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:23.578105 master-0 
kubenswrapper[16352]: I0307 21:37:23.577988 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:24.510070 master-0 kubenswrapper[16352]: I0307 21:37:24.507659 16352 generic.go:334] "Generic (PLEG): container finished" podID="e849276d-9145-4c9b-a423-f45a8db0cb25" containerID="282c140eb305e13eb602ba5d73fdec54d3b3cf03a2ddb27c8195fddbdda7c5e3" exitCode=0 Mar 07 21:37:24.510070 master-0 kubenswrapper[16352]: I0307 21:37:24.507808 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zt56c" event={"ID":"e849276d-9145-4c9b-a423-f45a8db0cb25","Type":"ContainerDied","Data":"282c140eb305e13eb602ba5d73fdec54d3b3cf03a2ddb27c8195fddbdda7c5e3"} Mar 07 21:37:24.880635 master-0 kubenswrapper[16352]: I0307 21:37:24.880551 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-zt56c" Mar 07 21:37:24.974649 master-0 kubenswrapper[16352]: I0307 21:37:24.974346 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wr4pp\" (UniqueName: \"kubernetes.io/projected/e849276d-9145-4c9b-a423-f45a8db0cb25-kube-api-access-wr4pp\") pod \"e849276d-9145-4c9b-a423-f45a8db0cb25\" (UID: \"e849276d-9145-4c9b-a423-f45a8db0cb25\") " Mar 07 21:37:24.978718 master-0 kubenswrapper[16352]: I0307 21:37:24.978612 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e849276d-9145-4c9b-a423-f45a8db0cb25-kube-api-access-wr4pp" (OuterVolumeSpecName: "kube-api-access-wr4pp") pod "e849276d-9145-4c9b-a423-f45a8db0cb25" (UID: "e849276d-9145-4c9b-a423-f45a8db0cb25"). InnerVolumeSpecName "kube-api-access-wr4pp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:37:25.009141 master-0 kubenswrapper[16352]: I0307 21:37:25.009020 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-klqvq"] Mar 07 21:37:25.014020 master-0 kubenswrapper[16352]: W0307 21:37:25.013947 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd7e018_1ab7_4803_921f_f7c17fab5263.slice/crio-470b5bda3ddd2e15fe428bf1e565a0f0c101587d37e4731bd120f1f2e8908c69 WatchSource:0}: Error finding container 470b5bda3ddd2e15fe428bf1e565a0f0c101587d37e4731bd120f1f2e8908c69: Status 404 returned error can't find the container with id 470b5bda3ddd2e15fe428bf1e565a0f0c101587d37e4731bd120f1f2e8908c69 Mar 07 21:37:25.077627 master-0 kubenswrapper[16352]: I0307 21:37:25.077469 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wr4pp\" (UniqueName: \"kubernetes.io/projected/e849276d-9145-4c9b-a423-f45a8db0cb25-kube-api-access-wr4pp\") on node \"master-0\" DevicePath \"\"" Mar 07 21:37:25.531886 master-0 kubenswrapper[16352]: I0307 21:37:25.531761 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-zt56c" event={"ID":"e849276d-9145-4c9b-a423-f45a8db0cb25","Type":"ContainerDied","Data":"b29385829fb82ea191fb154b95485d35eaa3ec5518ace6acb61ac79f652632b0"} Mar 07 21:37:25.532600 master-0 kubenswrapper[16352]: I0307 21:37:25.531905 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-zt56c" Mar 07 21:37:25.532600 master-0 kubenswrapper[16352]: I0307 21:37:25.531976 16352 scope.go:117] "RemoveContainer" containerID="282c140eb305e13eb602ba5d73fdec54d3b3cf03a2ddb27c8195fddbdda7c5e3" Mar 07 21:37:25.535761 master-0 kubenswrapper[16352]: I0307 21:37:25.535611 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-klqvq" event={"ID":"6cd7e018-1ab7-4803-921f-f7c17fab5263","Type":"ContainerStarted","Data":"470b5bda3ddd2e15fe428bf1e565a0f0c101587d37e4731bd120f1f2e8908c69"} Mar 07 21:37:25.586097 master-0 kubenswrapper[16352]: I0307 21:37:25.585989 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-zt56c"] Mar 07 21:37:25.604225 master-0 kubenswrapper[16352]: I0307 21:37:25.604107 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-zt56c"] Mar 07 21:37:26.553983 master-0 kubenswrapper[16352]: I0307 21:37:26.553893 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-klqvq" event={"ID":"6cd7e018-1ab7-4803-921f-f7c17fab5263","Type":"ContainerStarted","Data":"036a4c411b6ddfab07c7706fab7352ec06648776cfb48b9e53bc91ce8b745cf9"} Mar 07 21:37:26.592214 master-0 kubenswrapper[16352]: I0307 21:37:26.592057 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-klqvq" podStartSLOduration=3.078304432 podStartE2EDuration="3.592019693s" podCreationTimestamp="2026-03-07 21:37:23 +0000 UTC" firstStartedPulling="2026-03-07 21:37:25.019604354 +0000 UTC m=+1168.090309413" lastFinishedPulling="2026-03-07 21:37:25.533319575 +0000 UTC m=+1168.604024674" observedRunningTime="2026-03-07 21:37:26.576564454 +0000 UTC m=+1169.647269553" watchObservedRunningTime="2026-03-07 21:37:26.592019693 +0000 UTC m=+1169.662724812" Mar 07 
21:37:27.215789 master-0 kubenswrapper[16352]: I0307 21:37:27.215484 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e849276d-9145-4c9b-a423-f45a8db0cb25" path="/var/lib/kubelet/pods/e849276d-9145-4c9b-a423-f45a8db0cb25/volumes" Mar 07 21:37:33.579508 master-0 kubenswrapper[16352]: I0307 21:37:33.579377 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:33.579508 master-0 kubenswrapper[16352]: I0307 21:37:33.579506 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:33.622278 master-0 kubenswrapper[16352]: I0307 21:37:33.622182 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:33.684569 master-0 kubenswrapper[16352]: I0307 21:37:33.684468 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-klqvq" Mar 07 21:37:47.501285 master-0 kubenswrapper[16352]: I0307 21:37:47.501130 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5"] Mar 07 21:37:47.504468 master-0 kubenswrapper[16352]: E0307 21:37:47.502098 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e849276d-9145-4c9b-a423-f45a8db0cb25" containerName="registry-server" Mar 07 21:37:47.504468 master-0 kubenswrapper[16352]: I0307 21:37:47.502236 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e849276d-9145-4c9b-a423-f45a8db0cb25" containerName="registry-server" Mar 07 21:37:47.504468 master-0 kubenswrapper[16352]: I0307 21:37:47.502645 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="e849276d-9145-4c9b-a423-f45a8db0cb25" containerName="registry-server" Mar 07 21:37:47.505570 master-0 
kubenswrapper[16352]: I0307 21:37:47.505502 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.519330 master-0 kubenswrapper[16352]: I0307 21:37:47.519212 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5"] Mar 07 21:37:47.702437 master-0 kubenswrapper[16352]: I0307 21:37:47.702336 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.702437 master-0 kubenswrapper[16352]: I0307 21:37:47.702431 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.702960 master-0 kubenswrapper[16352]: I0307 21:37:47.702482 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qmcsc\" (UniqueName: \"kubernetes.io/projected/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-kube-api-access-qmcsc\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.805703 master-0 kubenswrapper[16352]: I0307 21:37:47.805511 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.805703 master-0 kubenswrapper[16352]: I0307 21:37:47.805597 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.806411 master-0 kubenswrapper[16352]: I0307 21:37:47.806328 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qmcsc\" (UniqueName: \"kubernetes.io/projected/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-kube-api-access-qmcsc\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.806498 master-0 kubenswrapper[16352]: I0307 21:37:47.806410 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-util\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.806498 master-0 kubenswrapper[16352]: I0307 21:37:47.806370 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-bundle\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.836428 master-0 kubenswrapper[16352]: I0307 21:37:47.836347 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qmcsc\" (UniqueName: \"kubernetes.io/projected/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-kube-api-access-qmcsc\") pod \"0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:47.846895 master-0 kubenswrapper[16352]: I0307 21:37:47.846828 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:48.317201 master-0 kubenswrapper[16352]: I0307 21:37:48.317111 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5"] Mar 07 21:37:48.320459 master-0 kubenswrapper[16352]: W0307 21:37:48.320348 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31047764_b8ab_47b5_a0d2_0cf0f320e4d3.slice/crio-402266308bc399a5528807ddd297965584c78b74a5b8f49b7b030cca5b11a80b WatchSource:0}: Error finding container 402266308bc399a5528807ddd297965584c78b74a5b8f49b7b030cca5b11a80b: Status 404 returned error can't find the container with id 402266308bc399a5528807ddd297965584c78b74a5b8f49b7b030cca5b11a80b Mar 07 21:37:48.790809 master-0 kubenswrapper[16352]: E0307 21:37:48.790735 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31047764_b8ab_47b5_a0d2_0cf0f320e4d3.slice/crio-b65e09d7cc5d65f22731db4442478f4f9dc6b4160d8a58836b4cfaded6645299.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod31047764_b8ab_47b5_a0d2_0cf0f320e4d3.slice/crio-conmon-b65e09d7cc5d65f22731db4442478f4f9dc6b4160d8a58836b4cfaded6645299.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:37:48.838874 master-0 kubenswrapper[16352]: I0307 21:37:48.838709 16352 generic.go:334] "Generic (PLEG): container finished" podID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerID="b65e09d7cc5d65f22731db4442478f4f9dc6b4160d8a58836b4cfaded6645299" exitCode=0 Mar 07 21:37:48.838874 master-0 kubenswrapper[16352]: I0307 21:37:48.838799 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" event={"ID":"31047764-b8ab-47b5-a0d2-0cf0f320e4d3","Type":"ContainerDied","Data":"b65e09d7cc5d65f22731db4442478f4f9dc6b4160d8a58836b4cfaded6645299"} Mar 07 21:37:48.838874 master-0 kubenswrapper[16352]: I0307 21:37:48.838843 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" event={"ID":"31047764-b8ab-47b5-a0d2-0cf0f320e4d3","Type":"ContainerStarted","Data":"402266308bc399a5528807ddd297965584c78b74a5b8f49b7b030cca5b11a80b"} Mar 07 21:37:49.861535 master-0 kubenswrapper[16352]: I0307 21:37:49.861456 16352 generic.go:334] "Generic (PLEG): container finished" podID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerID="3eedc0bad81831422c05b390081848a6cccfda938d8ecbd529cf5c783dcc755a" exitCode=0 Mar 07 21:37:49.863027 master-0 kubenswrapper[16352]: I0307 21:37:49.861560 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" 
event={"ID":"31047764-b8ab-47b5-a0d2-0cf0f320e4d3","Type":"ContainerDied","Data":"3eedc0bad81831422c05b390081848a6cccfda938d8ecbd529cf5c783dcc755a"} Mar 07 21:37:50.888207 master-0 kubenswrapper[16352]: I0307 21:37:50.888105 16352 generic.go:334] "Generic (PLEG): container finished" podID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerID="b31196a49de1fbeea47a77917221c5244341007a0b6a39ee25939be846259612" exitCode=0 Mar 07 21:37:50.889188 master-0 kubenswrapper[16352]: I0307 21:37:50.888219 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" event={"ID":"31047764-b8ab-47b5-a0d2-0cf0f320e4d3","Type":"ContainerDied","Data":"b31196a49de1fbeea47a77917221c5244341007a0b6a39ee25939be846259612"} Mar 07 21:37:52.390914 master-0 kubenswrapper[16352]: I0307 21:37:52.390852 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" Mar 07 21:37:52.455821 master-0 kubenswrapper[16352]: I0307 21:37:52.455092 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qmcsc\" (UniqueName: \"kubernetes.io/projected/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-kube-api-access-qmcsc\") pod \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " Mar 07 21:37:52.455821 master-0 kubenswrapper[16352]: I0307 21:37:52.455233 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-bundle\") pod \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") " Mar 07 21:37:52.455821 master-0 kubenswrapper[16352]: I0307 21:37:52.455347 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-util\") pod \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\" (UID: \"31047764-b8ab-47b5-a0d2-0cf0f320e4d3\") "
Mar 07 21:37:52.456375 master-0 kubenswrapper[16352]: I0307 21:37:52.456304 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-bundle" (OuterVolumeSpecName: "bundle") pod "31047764-b8ab-47b5-a0d2-0cf0f320e4d3" (UID: "31047764-b8ab-47b5-a0d2-0cf0f320e4d3"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:37:52.471544 master-0 kubenswrapper[16352]: I0307 21:37:52.469381 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-kube-api-access-qmcsc" (OuterVolumeSpecName: "kube-api-access-qmcsc") pod "31047764-b8ab-47b5-a0d2-0cf0f320e4d3" (UID: "31047764-b8ab-47b5-a0d2-0cf0f320e4d3"). InnerVolumeSpecName "kube-api-access-qmcsc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:37:52.475100 master-0 kubenswrapper[16352]: I0307 21:37:52.475014 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-util" (OuterVolumeSpecName: "util") pod "31047764-b8ab-47b5-a0d2-0cf0f320e4d3" (UID: "31047764-b8ab-47b5-a0d2-0cf0f320e4d3"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:37:52.556881 master-0 kubenswrapper[16352]: I0307 21:37:52.556620 16352 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-util\") on node \"master-0\" DevicePath \"\""
Mar 07 21:37:52.558002 master-0 kubenswrapper[16352]: I0307 21:37:52.557950 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qmcsc\" (UniqueName: \"kubernetes.io/projected/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-kube-api-access-qmcsc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:37:52.558002 master-0 kubenswrapper[16352]: I0307 21:37:52.557981 16352 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/31047764-b8ab-47b5-a0d2-0cf0f320e4d3-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:37:52.921337 master-0 kubenswrapper[16352]: I0307 21:37:52.921253 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5" event={"ID":"31047764-b8ab-47b5-a0d2-0cf0f320e4d3","Type":"ContainerDied","Data":"402266308bc399a5528807ddd297965584c78b74a5b8f49b7b030cca5b11a80b"}
Mar 07 21:37:52.921337 master-0 kubenswrapper[16352]: I0307 21:37:52.921324 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="402266308bc399a5528807ddd297965584c78b74a5b8f49b7b030cca5b11a80b"
Mar 07 21:37:52.921693 master-0 kubenswrapper[16352]: I0307 21:37:52.921447 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/0183f44be967a8d69ee94383c30042c5e53a5fa4a88b2bb48556d11f99mmjd5"
Mar 07 21:37:56.258346 master-0 kubenswrapper[16352]: I0307 21:37:56.258272 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"]
Mar 07 21:37:56.259203 master-0 kubenswrapper[16352]: E0307 21:37:56.258706 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerName="pull"
Mar 07 21:37:56.259203 master-0 kubenswrapper[16352]: I0307 21:37:56.258721 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerName="pull"
Mar 07 21:37:56.259203 master-0 kubenswrapper[16352]: E0307 21:37:56.258733 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerName="util"
Mar 07 21:37:56.259203 master-0 kubenswrapper[16352]: I0307 21:37:56.258740 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerName="util"
Mar 07 21:37:56.259203 master-0 kubenswrapper[16352]: E0307 21:37:56.258757 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerName="extract"
Mar 07 21:37:56.259203 master-0 kubenswrapper[16352]: I0307 21:37:56.258763 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerName="extract"
Mar 07 21:37:56.259203 master-0 kubenswrapper[16352]: I0307 21:37:56.258956 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="31047764-b8ab-47b5-a0d2-0cf0f320e4d3" containerName="extract"
Mar 07 21:37:56.262453 master-0 kubenswrapper[16352]: I0307 21:37:56.259533 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"
Mar 07 21:37:56.290905 master-0 kubenswrapper[16352]: I0307 21:37:56.290823 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"]
Mar 07 21:37:56.438924 master-0 kubenswrapper[16352]: I0307 21:37:56.438857 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqvmc\" (UniqueName: \"kubernetes.io/projected/9fa58a7c-af63-4d16-ae66-316130305053-kube-api-access-sqvmc\") pod \"openstack-operator-controller-init-6f44f7b99f-fplrp\" (UID: \"9fa58a7c-af63-4d16-ae66-316130305053\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"
Mar 07 21:37:56.541417 master-0 kubenswrapper[16352]: I0307 21:37:56.541220 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sqvmc\" (UniqueName: \"kubernetes.io/projected/9fa58a7c-af63-4d16-ae66-316130305053-kube-api-access-sqvmc\") pod \"openstack-operator-controller-init-6f44f7b99f-fplrp\" (UID: \"9fa58a7c-af63-4d16-ae66-316130305053\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"
Mar 07 21:37:56.561365 master-0 kubenswrapper[16352]: I0307 21:37:56.561302 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqvmc\" (UniqueName: \"kubernetes.io/projected/9fa58a7c-af63-4d16-ae66-316130305053-kube-api-access-sqvmc\") pod \"openstack-operator-controller-init-6f44f7b99f-fplrp\" (UID: \"9fa58a7c-af63-4d16-ae66-316130305053\") " pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"
Mar 07 21:37:56.582494 master-0 kubenswrapper[16352]: I0307 21:37:56.582432 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"
Mar 07 21:37:57.070885 master-0 kubenswrapper[16352]: I0307 21:37:57.070806 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"]
Mar 07 21:37:57.980214 master-0 kubenswrapper[16352]: I0307 21:37:57.980118 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp" event={"ID":"9fa58a7c-af63-4d16-ae66-316130305053","Type":"ContainerStarted","Data":"82ac123442dc29ecd668551d612ebbb7b6f6b9e82c5cdb9ffc814d7bf05197b8"}
Mar 07 21:38:02.049854 master-0 kubenswrapper[16352]: I0307 21:38:02.049595 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp" event={"ID":"9fa58a7c-af63-4d16-ae66-316130305053","Type":"ContainerStarted","Data":"e4db9d4b2785bc2d3ef86f3abadacb0c50dd74da2f3fe442732420b51d89e297"}
Mar 07 21:38:02.050453 master-0 kubenswrapper[16352]: I0307 21:38:02.049874 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"
Mar 07 21:38:02.122787 master-0 kubenswrapper[16352]: I0307 21:38:02.119858 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp" podStartSLOduration=1.500026046 podStartE2EDuration="6.119827477s" podCreationTimestamp="2026-03-07 21:37:56 +0000 UTC" firstStartedPulling="2026-03-07 21:37:57.08126457 +0000 UTC m=+1200.151969659" lastFinishedPulling="2026-03-07 21:38:01.701065971 +0000 UTC m=+1204.771771090" observedRunningTime="2026-03-07 21:38:02.113861585 +0000 UTC m=+1205.184566644" watchObservedRunningTime="2026-03-07 21:38:02.119827477 +0000 UTC m=+1205.190532536"
Mar 07 21:38:06.587884 master-0 kubenswrapper[16352]: I0307 21:38:06.587797 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-6f44f7b99f-fplrp"
Mar 07 21:38:27.221324 master-0 kubenswrapper[16352]: I0307 21:38:27.218385 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"]
Mar 07 21:38:27.221324 master-0 kubenswrapper[16352]: I0307 21:38:27.220176 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"
Mar 07 21:38:27.239718 master-0 kubenswrapper[16352]: I0307 21:38:27.236746 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"]
Mar 07 21:38:27.239718 master-0 kubenswrapper[16352]: I0307 21:38:27.238081 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"
Mar 07 21:38:27.261373 master-0 kubenswrapper[16352]: I0307 21:38:27.260292 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"]
Mar 07 21:38:27.284198 master-0 kubenswrapper[16352]: I0307 21:38:27.284139 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"]
Mar 07 21:38:27.290782 master-0 kubenswrapper[16352]: I0307 21:38:27.286060 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"
Mar 07 21:38:27.302797 master-0 kubenswrapper[16352]: I0307 21:38:27.299048 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"]
Mar 07 21:38:27.316808 master-0 kubenswrapper[16352]: I0307 21:38:27.316735 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"]
Mar 07 21:38:27.350778 master-0 kubenswrapper[16352]: I0307 21:38:27.349531 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"]
Mar 07 21:38:27.351428 master-0 kubenswrapper[16352]: I0307 21:38:27.351401 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"
Mar 07 21:38:27.363745 master-0 kubenswrapper[16352]: I0307 21:38:27.363678 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2mjt\" (UniqueName: \"kubernetes.io/projected/b4e76285-f40c-4114-b9f1-fcf795be20cc-kube-api-access-l2mjt\") pod \"cinder-operator-controller-manager-55d77d7b5c-hjt7h\" (UID: \"b4e76285-f40c-4114-b9f1-fcf795be20cc\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"
Mar 07 21:38:27.364053 master-0 kubenswrapper[16352]: I0307 21:38:27.363821 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhm9v\" (UniqueName: \"kubernetes.io/projected/9edc7b0f-b45b-4110-bb13-ea19ea3442c0-kube-api-access-lhm9v\") pod \"barbican-operator-controller-manager-6db6876945-nlssq\" (UID: \"9edc7b0f-b45b-4110-bb13-ea19ea3442c0\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"
Mar 07 21:38:27.382364 master-0 kubenswrapper[16352]: I0307 21:38:27.382301 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"]
Mar 07 21:38:27.396422 master-0 kubenswrapper[16352]: I0307 21:38:27.387016 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"
Mar 07 21:38:27.474974 master-0 kubenswrapper[16352]: I0307 21:38:27.474668 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"]
Mar 07 21:38:27.475286 master-0 kubenswrapper[16352]: I0307 21:38:27.475156 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsggk\" (UniqueName: \"kubernetes.io/projected/303ed055-819e-4c32-8f99-7feec7ded5e3-kube-api-access-gsggk\") pod \"glance-operator-controller-manager-64db6967f8-mq69x\" (UID: \"303ed055-819e-4c32-8f99-7feec7ded5e3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"
Mar 07 21:38:27.475286 master-0 kubenswrapper[16352]: I0307 21:38:27.475239 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhm9v\" (UniqueName: \"kubernetes.io/projected/9edc7b0f-b45b-4110-bb13-ea19ea3442c0-kube-api-access-lhm9v\") pod \"barbican-operator-controller-manager-6db6876945-nlssq\" (UID: \"9edc7b0f-b45b-4110-bb13-ea19ea3442c0\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"
Mar 07 21:38:27.475362 master-0 kubenswrapper[16352]: I0307 21:38:27.475335 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqrlk\" (UniqueName: \"kubernetes.io/projected/6d37d2c9-97c2-49ea-9629-aa17591de886-kube-api-access-jqrlk\") pod \"heat-operator-controller-manager-cf99c678f-qmcr7\" (UID: \"6d37d2c9-97c2-49ea-9629-aa17591de886\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"
Mar 07 21:38:27.475797 master-0 kubenswrapper[16352]: I0307 21:38:27.475764 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2mjt\" (UniqueName: \"kubernetes.io/projected/b4e76285-f40c-4114-b9f1-fcf795be20cc-kube-api-access-l2mjt\") pod \"cinder-operator-controller-manager-55d77d7b5c-hjt7h\" (UID: \"b4e76285-f40c-4114-b9f1-fcf795be20cc\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"
Mar 07 21:38:27.476174 master-0 kubenswrapper[16352]: I0307 21:38:27.476121 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmqrz\" (UniqueName: \"kubernetes.io/projected/dcba24dc-efc6-4686-9782-ac72720cbe35-kube-api-access-kmqrz\") pod \"designate-operator-controller-manager-5d87c9d997-jzt22\" (UID: \"dcba24dc-efc6-4686-9782-ac72720cbe35\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"
Mar 07 21:38:27.533338 master-0 kubenswrapper[16352]: I0307 21:38:27.533264 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"]
Mar 07 21:38:27.536803 master-0 kubenswrapper[16352]: I0307 21:38:27.535723 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2mjt\" (UniqueName: \"kubernetes.io/projected/b4e76285-f40c-4114-b9f1-fcf795be20cc-kube-api-access-l2mjt\") pod \"cinder-operator-controller-manager-55d77d7b5c-hjt7h\" (UID: \"b4e76285-f40c-4114-b9f1-fcf795be20cc\") " pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"
Mar 07 21:38:27.545009 master-0 kubenswrapper[16352]: I0307 21:38:27.544500 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhm9v\" (UniqueName: \"kubernetes.io/projected/9edc7b0f-b45b-4110-bb13-ea19ea3442c0-kube-api-access-lhm9v\") pod \"barbican-operator-controller-manager-6db6876945-nlssq\" (UID: \"9edc7b0f-b45b-4110-bb13-ea19ea3442c0\") " pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"
Mar 07 21:38:27.575070 master-0 kubenswrapper[16352]: I0307 21:38:27.570397 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"
Mar 07 21:38:27.579475 master-0 kubenswrapper[16352]: I0307 21:38:27.577853 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmqrz\" (UniqueName: \"kubernetes.io/projected/dcba24dc-efc6-4686-9782-ac72720cbe35-kube-api-access-kmqrz\") pod \"designate-operator-controller-manager-5d87c9d997-jzt22\" (UID: \"dcba24dc-efc6-4686-9782-ac72720cbe35\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"
Mar 07 21:38:27.579475 master-0 kubenswrapper[16352]: I0307 21:38:27.577955 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsggk\" (UniqueName: \"kubernetes.io/projected/303ed055-819e-4c32-8f99-7feec7ded5e3-kube-api-access-gsggk\") pod \"glance-operator-controller-manager-64db6967f8-mq69x\" (UID: \"303ed055-819e-4c32-8f99-7feec7ded5e3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"
Mar 07 21:38:27.579475 master-0 kubenswrapper[16352]: I0307 21:38:27.578039 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqrlk\" (UniqueName: \"kubernetes.io/projected/6d37d2c9-97c2-49ea-9629-aa17591de886-kube-api-access-jqrlk\") pod \"heat-operator-controller-manager-cf99c678f-qmcr7\" (UID: \"6d37d2c9-97c2-49ea-9629-aa17591de886\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"
Mar 07 21:38:27.608613 master-0 kubenswrapper[16352]: I0307 21:38:27.607365 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqrlk\" (UniqueName: \"kubernetes.io/projected/6d37d2c9-97c2-49ea-9629-aa17591de886-kube-api-access-jqrlk\") pod \"heat-operator-controller-manager-cf99c678f-qmcr7\" (UID: \"6d37d2c9-97c2-49ea-9629-aa17591de886\") " pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"
Mar 07 21:38:27.608613 master-0 kubenswrapper[16352]: I0307 21:38:27.607452 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2"]
Mar 07 21:38:27.608994 master-0 kubenswrapper[16352]: I0307 21:38:27.608967 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2"
Mar 07 21:38:27.609789 master-0 kubenswrapper[16352]: I0307 21:38:27.609737 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmqrz\" (UniqueName: \"kubernetes.io/projected/dcba24dc-efc6-4686-9782-ac72720cbe35-kube-api-access-kmqrz\") pod \"designate-operator-controller-manager-5d87c9d997-jzt22\" (UID: \"dcba24dc-efc6-4686-9782-ac72720cbe35\") " pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"
Mar 07 21:38:27.610544 master-0 kubenswrapper[16352]: I0307 21:38:27.610455 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsggk\" (UniqueName: \"kubernetes.io/projected/303ed055-819e-4c32-8f99-7feec7ded5e3-kube-api-access-gsggk\") pod \"glance-operator-controller-manager-64db6967f8-mq69x\" (UID: \"303ed055-819e-4c32-8f99-7feec7ded5e3\") " pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"
Mar 07 21:38:27.614760 master-0 kubenswrapper[16352]: I0307 21:38:27.614637 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"
Mar 07 21:38:27.622893 master-0 kubenswrapper[16352]: I0307 21:38:27.622825 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2"]
Mar 07 21:38:27.644533 master-0 kubenswrapper[16352]: I0307 21:38:27.644427 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"
Mar 07 21:38:27.662663 master-0 kubenswrapper[16352]: I0307 21:38:27.662584 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"]
Mar 07 21:38:27.665605 master-0 kubenswrapper[16352]: I0307 21:38:27.665576 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"
Mar 07 21:38:27.722662 master-0 kubenswrapper[16352]: I0307 21:38:27.722599 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 07 21:38:27.723463 master-0 kubenswrapper[16352]: I0307 21:38:27.723358 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"
Mar 07 21:38:27.745301 master-0 kubenswrapper[16352]: I0307 21:38:27.745075 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"
Mar 07 21:38:27.769875 master-0 kubenswrapper[16352]: I0307 21:38:27.769811 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"]
Mar 07 21:38:27.854715 master-0 kubenswrapper[16352]: I0307 21:38:27.852254 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fhsnr\" (UniqueName: \"kubernetes.io/projected/c333624b-10f8-40cf-95ed-52fc6ccea867-kube-api-access-fhsnr\") pod \"horizon-operator-controller-manager-78bc7f9bd9-rcxp2\" (UID: \"c333624b-10f8-40cf-95ed-52fc6ccea867\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2"
Mar 07 21:38:27.854715 master-0 kubenswrapper[16352]: I0307 21:38:27.852394 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"
Mar 07 21:38:27.854715 master-0 kubenswrapper[16352]: I0307 21:38:27.852467 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-74d4m\" (UniqueName: \"kubernetes.io/projected/410492a4-07d7-4b98-9c15-40200fe85474-kube-api-access-74d4m\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"
Mar 07 21:38:27.958880 master-0 kubenswrapper[16352]: I0307 21:38:27.954947 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w"]
Mar 07 21:38:27.958880 master-0 kubenswrapper[16352]: I0307 21:38:27.956810 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w"
Mar 07 21:38:27.967883 master-0 kubenswrapper[16352]: I0307 21:38:27.959658 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-74d4m\" (UniqueName: \"kubernetes.io/projected/410492a4-07d7-4b98-9c15-40200fe85474-kube-api-access-74d4m\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"
Mar 07 21:38:27.969587 master-0 kubenswrapper[16352]: I0307 21:38:27.968444 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fhsnr\" (UniqueName: \"kubernetes.io/projected/c333624b-10f8-40cf-95ed-52fc6ccea867-kube-api-access-fhsnr\") pod \"horizon-operator-controller-manager-78bc7f9bd9-rcxp2\" (UID: \"c333624b-10f8-40cf-95ed-52fc6ccea867\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2"
Mar 07 21:38:27.969587 master-0 kubenswrapper[16352]: I0307 21:38:27.968769 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"
Mar 07 21:38:27.969587 master-0 kubenswrapper[16352]: E0307 21:38:27.969128 16352 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 07 21:38:27.969587 master-0 kubenswrapper[16352]: E0307 21:38:27.969206 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert podName:410492a4-07d7-4b98-9c15-40200fe85474 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:28.469175444 +0000 UTC m=+1231.539880503 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert") pod "infra-operator-controller-manager-65b58d74b-rrd9h" (UID: "410492a4-07d7-4b98-9c15-40200fe85474") : secret "infra-operator-webhook-server-cert" not found
Mar 07 21:38:28.003607 master-0 kubenswrapper[16352]: I0307 21:38:28.003558 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c"]
Mar 07 21:38:28.005638 master-0 kubenswrapper[16352]: I0307 21:38:28.005617 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c"
Mar 07 21:38:28.013794 master-0 kubenswrapper[16352]: I0307 21:38:28.013738 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w"]
Mar 07 21:38:28.025450 master-0 kubenswrapper[16352]: I0307 21:38:28.023947 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz"]
Mar 07 21:38:28.025610 master-0 kubenswrapper[16352]: I0307 21:38:28.025514 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz"
Mar 07 21:38:28.072053 master-0 kubenswrapper[16352]: I0307 21:38:28.071013 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkqpb\" (UniqueName: \"kubernetes.io/projected/e98642c0-4c59-4b2a-98ba-7f8ae096c57f-kube-api-access-mkqpb\") pod \"ironic-operator-controller-manager-545456dc4-xth7w\" (UID: \"e98642c0-4c59-4b2a-98ba-7f8ae096c57f\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w"
Mar 07 21:38:28.086090 master-0 kubenswrapper[16352]: I0307 21:38:28.085655 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c"]
Mar 07 21:38:28.099962 master-0 kubenswrapper[16352]: I0307 21:38:28.096340 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz"]
Mar 07 21:38:28.115964 master-0 kubenswrapper[16352]: I0307 21:38:28.107590 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g"]
Mar 07 21:38:28.115964 master-0 kubenswrapper[16352]: I0307 21:38:28.109488 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g"
Mar 07 21:38:28.126796 master-0 kubenswrapper[16352]: I0307 21:38:28.125802 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt"]
Mar 07 21:38:28.131374 master-0 kubenswrapper[16352]: I0307 21:38:28.131126 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt"
Mar 07 21:38:28.135122 master-0 kubenswrapper[16352]: I0307 21:38:28.134531 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-74d4m\" (UniqueName: \"kubernetes.io/projected/410492a4-07d7-4b98-9c15-40200fe85474-kube-api-access-74d4m\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"
Mar 07 21:38:28.138188 master-0 kubenswrapper[16352]: I0307 21:38:28.138134 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fhsnr\" (UniqueName: \"kubernetes.io/projected/c333624b-10f8-40cf-95ed-52fc6ccea867-kube-api-access-fhsnr\") pod \"horizon-operator-controller-manager-78bc7f9bd9-rcxp2\" (UID: \"c333624b-10f8-40cf-95ed-52fc6ccea867\") " pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2"
Mar 07 21:38:28.162257 master-0 kubenswrapper[16352]: I0307 21:38:28.161984 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g"]
Mar 07 21:38:28.173265 master-0 kubenswrapper[16352]: I0307 21:38:28.173202 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt"]
Mar 07 21:38:28.174251 master-0 kubenswrapper[16352]: I0307 21:38:28.174192 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6z88\" (UniqueName: \"kubernetes.io/projected/04444749-e365-4e55-b556-245fb7b416a2-kube-api-access-c6z88\") pod \"keystone-operator-controller-manager-7c789f89c6-zq79c\" (UID: \"04444749-e365-4e55-b556-245fb7b416a2\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c"
Mar 07 21:38:28.174321 master-0 kubenswrapper[16352]: I0307 21:38:28.174272 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkqpb\" (UniqueName: \"kubernetes.io/projected/e98642c0-4c59-4b2a-98ba-7f8ae096c57f-kube-api-access-mkqpb\") pod \"ironic-operator-controller-manager-545456dc4-xth7w\" (UID: \"e98642c0-4c59-4b2a-98ba-7f8ae096c57f\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w"
Mar 07 21:38:28.174370 master-0 kubenswrapper[16352]: I0307 21:38:28.174351 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x26t\" (UniqueName: \"kubernetes.io/projected/60f0e1d8-44b8-4b6d-8804-0cc47734a848-kube-api-access-5x26t\") pod \"mariadb-operator-controller-manager-7b6bfb6475-j288g\" (UID: \"60f0e1d8-44b8-4b6d-8804-0cc47734a848\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g"
Mar 07 21:38:28.174422 master-0 kubenswrapper[16352]: I0307 21:38:28.174404 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5d84v\" (UniqueName: \"kubernetes.io/projected/354439b7-abea-4781-9fdd-835ef380edb5-kube-api-access-5d84v\") pod \"neutron-operator-controller-manager-54688575f-vj8dt\" (UID: \"354439b7-abea-4781-9fdd-835ef380edb5\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt"
Mar 07 21:38:28.174460 master-0 kubenswrapper[16352]: I0307 21:38:28.174444 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-94fv9\" (UniqueName: \"kubernetes.io/projected/349cead9-443f-4a05-a890-6a85791d27c4-kube-api-access-94fv9\") pod \"manila-operator-controller-manager-67d996989d-7ksrz\" (UID: \"349cead9-443f-4a05-a890-6a85791d27c4\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz"
Mar 07 21:38:28.185905 master-0 kubenswrapper[16352]: I0307 21:38:28.185856 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt"]
Mar 07 21:38:28.187498 master-0 kubenswrapper[16352]: I0307 21:38:28.187474 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt"
Mar 07 21:38:28.192533 master-0 kubenswrapper[16352]: I0307 21:38:28.191569 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkqpb\" (UniqueName: \"kubernetes.io/projected/e98642c0-4c59-4b2a-98ba-7f8ae096c57f-kube-api-access-mkqpb\") pod \"ironic-operator-controller-manager-545456dc4-xth7w\" (UID: \"e98642c0-4c59-4b2a-98ba-7f8ae096c57f\") " pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w"
Mar 07 21:38:28.194148 master-0 kubenswrapper[16352]: I0307 21:38:28.194092 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt"]
Mar 07 21:38:28.202343 master-0 kubenswrapper[16352]: I0307 21:38:28.202308 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq"]
Mar 07 21:38:28.207779 master-0 kubenswrapper[16352]: I0307 21:38:28.204714 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq"
Mar 07 21:38:28.213943 master-0 kubenswrapper[16352]: I0307 21:38:28.210874 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5"]
Mar 07 21:38:28.213943 master-0 kubenswrapper[16352]: I0307 21:38:28.212591 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5"
Mar 07 21:38:28.221334 master-0 kubenswrapper[16352]: I0307 21:38:28.221032 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4"]
Mar 07 21:38:28.222256 master-0 kubenswrapper[16352]: I0307 21:38:28.221722 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 07 21:38:28.228342 master-0 kubenswrapper[16352]: I0307 21:38:28.228289 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4"
Mar 07 21:38:28.231388 master-0 kubenswrapper[16352]: I0307 21:38:28.230747 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq"]
Mar 07 21:38:28.242815 master-0 kubenswrapper[16352]: I0307 21:38:28.241627 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4"]
Mar 07 21:38:28.260998 master-0 kubenswrapper[16352]: I0307 21:38:28.259936 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5"]
Mar 07 21:38:28.271645 master-0 kubenswrapper[16352]: I0307 21:38:28.271563 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-l7256"]
Mar 07 21:38:28.273716 master-0 kubenswrapper[16352]: I0307 21:38:28.273402 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276280 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4vb4\" (UniqueName: \"kubernetes.io/projected/e58eb7fb-23b7-44ef-912c-d4dac0a71277-kube-api-access-m4vb4\") pod \"nova-operator-controller-manager-74b6b5dc96-ndppt\" (UID: \"e58eb7fb-23b7-44ef-912c-d4dac0a71277\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276374 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x26t\" (UniqueName: \"kubernetes.io/projected/60f0e1d8-44b8-4b6d-8804-0cc47734a848-kube-api-access-5x26t\") pod \"mariadb-operator-controller-manager-7b6bfb6475-j288g\" (UID: \"60f0e1d8-44b8-4b6d-8804-0cc47734a848\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276401 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sd9c\" (UniqueName: \"kubernetes.io/projected/733393c5-1380-4bb8-99e0-aa1757e573ab-kube-api-access-4sd9c\") pod \"placement-operator-controller-manager-648564c9fc-l7256\" (UID: \"733393c5-1380-4bb8-99e0-aa1757e573ab\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276433 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276457 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwm5q\" (UniqueName: \"kubernetes.io/projected/90729f66-afc2-451e-b136-964efe09b675-kube-api-access-nwm5q\") pod \"ovn-operator-controller-manager-75684d597f-ccbn4\" (UID: \"90729f66-afc2-451e-b136-964efe09b675\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276478 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5d84v\" (UniqueName: \"kubernetes.io/projected/354439b7-abea-4781-9fdd-835ef380edb5-kube-api-access-5d84v\") pod \"neutron-operator-controller-manager-54688575f-vj8dt\" (UID: \"354439b7-abea-4781-9fdd-835ef380edb5\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276508 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-94fv9\" (UniqueName: \"kubernetes.io/projected/349cead9-443f-4a05-a890-6a85791d27c4-kube-api-access-94fv9\") pod \"manila-operator-controller-manager-67d996989d-7ksrz\" (UID: \"349cead9-443f-4a05-a890-6a85791d27c4\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz"
Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276547 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9fxw\" (UniqueName: \"kubernetes.io/projected/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-kube-api-access-g9fxw\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") "
pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276581 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6z88\" (UniqueName: \"kubernetes.io/projected/04444749-e365-4e55-b556-245fb7b416a2-kube-api-access-c6z88\") pod \"keystone-operator-controller-manager-7c789f89c6-zq79c\" (UID: \"04444749-e365-4e55-b556-245fb7b416a2\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" Mar 07 21:38:28.279282 master-0 kubenswrapper[16352]: I0307 21:38:28.276599 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rndlq\" (UniqueName: \"kubernetes.io/projected/cf92475a-b059-4a3d-8781-f3c49c45fde1-kube-api-access-rndlq\") pod \"octavia-operator-controller-manager-5d86c7ddb7-2plwq\" (UID: \"cf92475a-b059-4a3d-8781-f3c49c45fde1\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" Mar 07 21:38:28.294021 master-0 kubenswrapper[16352]: I0307 21:38:28.283314 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw"] Mar 07 21:38:28.294021 master-0 kubenswrapper[16352]: I0307 21:38:28.291018 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" Mar 07 21:38:28.301879 master-0 kubenswrapper[16352]: I0307 21:38:28.301834 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-l7256"] Mar 07 21:38:28.308081 master-0 kubenswrapper[16352]: I0307 21:38:28.307174 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x26t\" (UniqueName: \"kubernetes.io/projected/60f0e1d8-44b8-4b6d-8804-0cc47734a848-kube-api-access-5x26t\") pod \"mariadb-operator-controller-manager-7b6bfb6475-j288g\" (UID: \"60f0e1d8-44b8-4b6d-8804-0cc47734a848\") " pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g" Mar 07 21:38:28.308081 master-0 kubenswrapper[16352]: I0307 21:38:28.307377 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6z88\" (UniqueName: \"kubernetes.io/projected/04444749-e365-4e55-b556-245fb7b416a2-kube-api-access-c6z88\") pod \"keystone-operator-controller-manager-7c789f89c6-zq79c\" (UID: \"04444749-e365-4e55-b556-245fb7b416a2\") " pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" Mar 07 21:38:28.308081 master-0 kubenswrapper[16352]: I0307 21:38:28.308049 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w" Mar 07 21:38:28.308340 master-0 kubenswrapper[16352]: I0307 21:38:28.308174 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-94fv9\" (UniqueName: \"kubernetes.io/projected/349cead9-443f-4a05-a890-6a85791d27c4-kube-api-access-94fv9\") pod \"manila-operator-controller-manager-67d996989d-7ksrz\" (UID: \"349cead9-443f-4a05-a890-6a85791d27c4\") " pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz" Mar 07 21:38:28.345064 master-0 kubenswrapper[16352]: I0307 21:38:28.344309 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5d84v\" (UniqueName: \"kubernetes.io/projected/354439b7-abea-4781-9fdd-835ef380edb5-kube-api-access-5d84v\") pod \"neutron-operator-controller-manager-54688575f-vj8dt\" (UID: \"354439b7-abea-4781-9fdd-835ef380edb5\") " pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt" Mar 07 21:38:28.369318 master-0 kubenswrapper[16352]: I0307 21:38:28.368827 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw"] Mar 07 21:38:28.372307 master-0 kubenswrapper[16352]: I0307 21:38:28.371379 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2" Mar 07 21:38:28.373535 master-0 kubenswrapper[16352]: I0307 21:38:28.372915 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" Mar 07 21:38:28.378162 master-0 kubenswrapper[16352]: I0307 21:38:28.377625 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nwm5q\" (UniqueName: \"kubernetes.io/projected/90729f66-afc2-451e-b136-964efe09b675-kube-api-access-nwm5q\") pod \"ovn-operator-controller-manager-75684d597f-ccbn4\" (UID: \"90729f66-afc2-451e-b136-964efe09b675\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" Mar 07 21:38:28.378162 master-0 kubenswrapper[16352]: I0307 21:38:28.377749 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9fxw\" (UniqueName: \"kubernetes.io/projected/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-kube-api-access-g9fxw\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:28.378162 master-0 kubenswrapper[16352]: I0307 21:38:28.378019 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rndlq\" (UniqueName: \"kubernetes.io/projected/cf92475a-b059-4a3d-8781-f3c49c45fde1-kube-api-access-rndlq\") pod \"octavia-operator-controller-manager-5d86c7ddb7-2plwq\" (UID: \"cf92475a-b059-4a3d-8781-f3c49c45fde1\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" Mar 07 21:38:28.378308 master-0 kubenswrapper[16352]: I0307 21:38:28.378178 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4vb4\" (UniqueName: \"kubernetes.io/projected/e58eb7fb-23b7-44ef-912c-d4dac0a71277-kube-api-access-m4vb4\") pod \"nova-operator-controller-manager-74b6b5dc96-ndppt\" (UID: \"e58eb7fb-23b7-44ef-912c-d4dac0a71277\") " 
pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" Mar 07 21:38:28.378308 master-0 kubenswrapper[16352]: I0307 21:38:28.378253 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sd9c\" (UniqueName: \"kubernetes.io/projected/733393c5-1380-4bb8-99e0-aa1757e573ab-kube-api-access-4sd9c\") pod \"placement-operator-controller-manager-648564c9fc-l7256\" (UID: \"733393c5-1380-4bb8-99e0-aa1757e573ab\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" Mar 07 21:38:28.378308 master-0 kubenswrapper[16352]: I0307 21:38:28.378288 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:28.378405 master-0 kubenswrapper[16352]: E0307 21:38:28.378378 16352 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:28.378438 master-0 kubenswrapper[16352]: E0307 21:38:28.378432 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert podName:4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:28.878416629 +0000 UTC m=+1231.949121688 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" (UID: "4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:28.416994 master-0 kubenswrapper[16352]: I0307 21:38:28.402597 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rndlq\" (UniqueName: \"kubernetes.io/projected/cf92475a-b059-4a3d-8781-f3c49c45fde1-kube-api-access-rndlq\") pod \"octavia-operator-controller-manager-5d86c7ddb7-2plwq\" (UID: \"cf92475a-b059-4a3d-8781-f3c49c45fde1\") " pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" Mar 07 21:38:28.416994 master-0 kubenswrapper[16352]: I0307 21:38:28.404125 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwm5q\" (UniqueName: \"kubernetes.io/projected/90729f66-afc2-451e-b136-964efe09b675-kube-api-access-nwm5q\") pod \"ovn-operator-controller-manager-75684d597f-ccbn4\" (UID: \"90729f66-afc2-451e-b136-964efe09b675\") " pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" Mar 07 21:38:28.416994 master-0 kubenswrapper[16352]: I0307 21:38:28.409069 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz" Mar 07 21:38:28.416994 master-0 kubenswrapper[16352]: I0307 21:38:28.409076 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sd9c\" (UniqueName: \"kubernetes.io/projected/733393c5-1380-4bb8-99e0-aa1757e573ab-kube-api-access-4sd9c\") pod \"placement-operator-controller-manager-648564c9fc-l7256\" (UID: \"733393c5-1380-4bb8-99e0-aa1757e573ab\") " pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" Mar 07 21:38:28.416994 master-0 kubenswrapper[16352]: I0307 21:38:28.413047 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9fxw\" (UniqueName: \"kubernetes.io/projected/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-kube-api-access-g9fxw\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:28.423323 master-0 kubenswrapper[16352]: I0307 21:38:28.423271 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4vb4\" (UniqueName: \"kubernetes.io/projected/e58eb7fb-23b7-44ef-912c-d4dac0a71277-kube-api-access-m4vb4\") pod \"nova-operator-controller-manager-74b6b5dc96-ndppt\" (UID: \"e58eb7fb-23b7-44ef-912c-d4dac0a71277\") " pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" Mar 07 21:38:28.440451 master-0 kubenswrapper[16352]: I0307 21:38:28.438124 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt"] Mar 07 21:38:28.446944 master-0 kubenswrapper[16352]: I0307 21:38:28.445345 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" Mar 07 21:38:28.453568 master-0 kubenswrapper[16352]: I0307 21:38:28.453044 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt"] Mar 07 21:38:28.460264 master-0 kubenswrapper[16352]: I0307 21:38:28.459889 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g" Mar 07 21:38:28.469382 master-0 kubenswrapper[16352]: I0307 21:38:28.468871 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2"] Mar 07 21:38:28.474010 master-0 kubenswrapper[16352]: I0307 21:38:28.473193 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" Mar 07 21:38:28.474010 master-0 kubenswrapper[16352]: I0307 21:38:28.473870 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt" Mar 07 21:38:28.530302 master-0 kubenswrapper[16352]: I0307 21:38:28.529643 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" Mar 07 21:38:28.532653 master-0 kubenswrapper[16352]: I0307 21:38:28.531906 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:28.532653 master-0 kubenswrapper[16352]: I0307 21:38:28.532118 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tb9x6\" (UniqueName: \"kubernetes.io/projected/caf39759-7091-4ef5-8013-c7a487f72d76-kube-api-access-tb9x6\") pod \"swift-operator-controller-manager-9b9ff9f4d-s8tqw\" (UID: \"caf39759-7091-4ef5-8013-c7a487f72d76\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" Mar 07 21:38:28.532653 master-0 kubenswrapper[16352]: E0307 21:38:28.532258 16352 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:28.532653 master-0 kubenswrapper[16352]: E0307 21:38:28.532340 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert podName:410492a4-07d7-4b98-9c15-40200fe85474 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:29.532312246 +0000 UTC m=+1232.603017305 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert") pod "infra-operator-controller-manager-65b58d74b-rrd9h" (UID: "410492a4-07d7-4b98-9c15-40200fe85474") : secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:28.548076 master-0 kubenswrapper[16352]: I0307 21:38:28.547389 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2"] Mar 07 21:38:28.555584 master-0 kubenswrapper[16352]: I0307 21:38:28.554641 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" Mar 07 21:38:28.573356 master-0 kubenswrapper[16352]: I0307 21:38:28.572487 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm"] Mar 07 21:38:28.575671 master-0 kubenswrapper[16352]: I0307 21:38:28.574802 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" Mar 07 21:38:28.584106 master-0 kubenswrapper[16352]: I0307 21:38:28.583426 16352 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 21:38:28.596856 master-0 kubenswrapper[16352]: I0307 21:38:28.596617 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm"] Mar 07 21:38:28.637671 master-0 kubenswrapper[16352]: I0307 21:38:28.636829 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bs6wc\" (UniqueName: \"kubernetes.io/projected/848e3768-065a-4a9b-bc8c-eb34b6eb2560-kube-api-access-bs6wc\") pod \"test-operator-controller-manager-55b5ff4dbb-9cpc2\" (UID: \"848e3768-065a-4a9b-bc8c-eb34b6eb2560\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" Mar 07 21:38:28.637671 master-0 kubenswrapper[16352]: I0307 21:38:28.636927 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98xgw\" (UniqueName: \"kubernetes.io/projected/eed655d4-c4c5-4dbf-86ff-a5ac776d428d-kube-api-access-98xgw\") pod \"watcher-operator-controller-manager-bccc79885-k5rcm\" (UID: \"eed655d4-c4c5-4dbf-86ff-a5ac776d428d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" Mar 07 21:38:28.637671 master-0 kubenswrapper[16352]: I0307 21:38:28.637044 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnws8\" (UniqueName: \"kubernetes.io/projected/01456598-46a5-49a8-8da8-3f47d303fd88-kube-api-access-fnws8\") pod \"telemetry-operator-controller-manager-5fdb694969-bbqxt\" (UID: \"01456598-46a5-49a8-8da8-3f47d303fd88\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" Mar 07 21:38:28.637671 
master-0 kubenswrapper[16352]: I0307 21:38:28.637198 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tb9x6\" (UniqueName: \"kubernetes.io/projected/caf39759-7091-4ef5-8013-c7a487f72d76-kube-api-access-tb9x6\") pod \"swift-operator-controller-manager-9b9ff9f4d-s8tqw\" (UID: \"caf39759-7091-4ef5-8013-c7a487f72d76\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" Mar 07 21:38:28.656726 master-0 kubenswrapper[16352]: I0307 21:38:28.656050 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tb9x6\" (UniqueName: \"kubernetes.io/projected/caf39759-7091-4ef5-8013-c7a487f72d76-kube-api-access-tb9x6\") pod \"swift-operator-controller-manager-9b9ff9f4d-s8tqw\" (UID: \"caf39759-7091-4ef5-8013-c7a487f72d76\") " pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" Mar 07 21:38:28.657794 master-0 kubenswrapper[16352]: I0307 21:38:28.657257 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" Mar 07 21:38:28.666547 master-0 kubenswrapper[16352]: I0307 21:38:28.665828 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr"] Mar 07 21:38:28.677354 master-0 kubenswrapper[16352]: I0307 21:38:28.676012 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.677354 master-0 kubenswrapper[16352]: I0307 21:38:28.677093 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" Mar 07 21:38:28.680342 master-0 kubenswrapper[16352]: I0307 21:38:28.679470 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 07 21:38:28.680342 master-0 kubenswrapper[16352]: I0307 21:38:28.679698 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 07 21:38:28.683218 master-0 kubenswrapper[16352]: I0307 21:38:28.682460 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr"] Mar 07 21:38:28.742291 master-0 kubenswrapper[16352]: I0307 21:38:28.725937 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9"] Mar 07 21:38:28.742291 master-0 kubenswrapper[16352]: I0307 21:38:28.727382 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" Mar 07 21:38:28.742291 master-0 kubenswrapper[16352]: I0307 21:38:28.735453 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9"] Mar 07 21:38:28.748175 master-0 kubenswrapper[16352]: I0307 21:38:28.748136 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" Mar 07 21:38:28.750809 master-0 kubenswrapper[16352]: I0307 21:38:28.749710 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j5zmt\" (UniqueName: \"kubernetes.io/projected/a1640d02-1d34-4805-9f58-de4973079a0d-kube-api-access-j5zmt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r8xj9\" (UID: \"a1640d02-1d34-4805-9f58-de4973079a0d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" Mar 07 21:38:28.750809 master-0 kubenswrapper[16352]: I0307 21:38:28.749769 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bs6wc\" (UniqueName: \"kubernetes.io/projected/848e3768-065a-4a9b-bc8c-eb34b6eb2560-kube-api-access-bs6wc\") pod \"test-operator-controller-manager-55b5ff4dbb-9cpc2\" (UID: \"848e3768-065a-4a9b-bc8c-eb34b6eb2560\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" Mar 07 21:38:28.750809 master-0 kubenswrapper[16352]: I0307 21:38:28.749792 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkjmd\" (UniqueName: \"kubernetes.io/projected/becc66df-cd3f-48e5-8684-f0cf6fabd257-kube-api-access-pkjmd\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.750809 master-0 kubenswrapper[16352]: I0307 21:38:28.749820 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98xgw\" (UniqueName: \"kubernetes.io/projected/eed655d4-c4c5-4dbf-86ff-a5ac776d428d-kube-api-access-98xgw\") pod \"watcher-operator-controller-manager-bccc79885-k5rcm\" (UID: \"eed655d4-c4c5-4dbf-86ff-a5ac776d428d\") " 
pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" Mar 07 21:38:28.750809 master-0 kubenswrapper[16352]: I0307 21:38:28.749842 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.750809 master-0 kubenswrapper[16352]: I0307 21:38:28.750514 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnws8\" (UniqueName: \"kubernetes.io/projected/01456598-46a5-49a8-8da8-3f47d303fd88-kube-api-access-fnws8\") pod \"telemetry-operator-controller-manager-5fdb694969-bbqxt\" (UID: \"01456598-46a5-49a8-8da8-3f47d303fd88\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" Mar 07 21:38:28.750809 master-0 kubenswrapper[16352]: I0307 21:38:28.750551 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.778152 master-0 kubenswrapper[16352]: I0307 21:38:28.778100 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98xgw\" (UniqueName: \"kubernetes.io/projected/eed655d4-c4c5-4dbf-86ff-a5ac776d428d-kube-api-access-98xgw\") pod \"watcher-operator-controller-manager-bccc79885-k5rcm\" (UID: \"eed655d4-c4c5-4dbf-86ff-a5ac776d428d\") " pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" Mar 07 
21:38:28.779358 master-0 kubenswrapper[16352]: I0307 21:38:28.779306 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bs6wc\" (UniqueName: \"kubernetes.io/projected/848e3768-065a-4a9b-bc8c-eb34b6eb2560-kube-api-access-bs6wc\") pod \"test-operator-controller-manager-55b5ff4dbb-9cpc2\" (UID: \"848e3768-065a-4a9b-bc8c-eb34b6eb2560\") " pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" Mar 07 21:38:28.779458 master-0 kubenswrapper[16352]: I0307 21:38:28.779393 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnws8\" (UniqueName: \"kubernetes.io/projected/01456598-46a5-49a8-8da8-3f47d303fd88-kube-api-access-fnws8\") pod \"telemetry-operator-controller-manager-5fdb694969-bbqxt\" (UID: \"01456598-46a5-49a8-8da8-3f47d303fd88\") " pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" Mar 07 21:38:28.803461 master-0 kubenswrapper[16352]: I0307 21:38:28.780836 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" Mar 07 21:38:28.803461 master-0 kubenswrapper[16352]: I0307 21:38:28.790739 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h"] Mar 07 21:38:28.851800 master-0 kubenswrapper[16352]: I0307 21:38:28.851733 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.851999 master-0 kubenswrapper[16352]: I0307 21:38:28.851835 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j5zmt\" (UniqueName: \"kubernetes.io/projected/a1640d02-1d34-4805-9f58-de4973079a0d-kube-api-access-j5zmt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r8xj9\" (UID: \"a1640d02-1d34-4805-9f58-de4973079a0d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" Mar 07 21:38:28.851999 master-0 kubenswrapper[16352]: I0307 21:38:28.851889 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkjmd\" (UniqueName: \"kubernetes.io/projected/becc66df-cd3f-48e5-8684-f0cf6fabd257-kube-api-access-pkjmd\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.851999 master-0 kubenswrapper[16352]: I0307 21:38:28.851930 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod 
\"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.852155 master-0 kubenswrapper[16352]: E0307 21:38:28.852085 16352 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 21:38:28.852155 master-0 kubenswrapper[16352]: E0307 21:38:28.852139 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:29.352123808 +0000 UTC m=+1232.422828867 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "metrics-server-cert" not found Mar 07 21:38:28.852221 master-0 kubenswrapper[16352]: E0307 21:38:28.852182 16352 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 21:38:28.852221 master-0 kubenswrapper[16352]: E0307 21:38:28.852201 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:29.35219531 +0000 UTC m=+1232.422900369 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "webhook-server-cert" not found Mar 07 21:38:28.874302 master-0 kubenswrapper[16352]: I0307 21:38:28.874237 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j5zmt\" (UniqueName: \"kubernetes.io/projected/a1640d02-1d34-4805-9f58-de4973079a0d-kube-api-access-j5zmt\") pod \"rabbitmq-cluster-operator-manager-668c99d594-r8xj9\" (UID: \"a1640d02-1d34-4805-9f58-de4973079a0d\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" Mar 07 21:38:28.876830 master-0 kubenswrapper[16352]: I0307 21:38:28.875165 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkjmd\" (UniqueName: \"kubernetes.io/projected/becc66df-cd3f-48e5-8684-f0cf6fabd257-kube-api-access-pkjmd\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:28.876830 master-0 kubenswrapper[16352]: I0307 21:38:28.876209 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7"] Mar 07 21:38:28.953420 master-0 kubenswrapper[16352]: I0307 21:38:28.953381 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:28.953942 master-0 kubenswrapper[16352]: E0307 21:38:28.953889 16352 secret.go:189] 
Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:28.954252 master-0 kubenswrapper[16352]: E0307 21:38:28.953973 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert podName:4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:29.953953128 +0000 UTC m=+1233.024658187 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" (UID: "4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:28.955256 master-0 kubenswrapper[16352]: I0307 21:38:28.955185 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x"] Mar 07 21:38:28.978601 master-0 kubenswrapper[16352]: I0307 21:38:28.978362 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq"] Mar 07 21:38:28.992714 master-0 kubenswrapper[16352]: I0307 21:38:28.992647 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22"] Mar 07 21:38:29.032641 master-0 kubenswrapper[16352]: W0307 21:38:29.032524 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddcba24dc_efc6_4686_9782_ac72720cbe35.slice/crio-ef0933ea1eda409d46f999484d3f33fda16d43b62c06be43345d39c297e349df WatchSource:0}: Error finding container ef0933ea1eda409d46f999484d3f33fda16d43b62c06be43345d39c297e349df: Status 404 returned error can't find the container with id 
ef0933ea1eda409d46f999484d3f33fda16d43b62c06be43345d39c297e349df Mar 07 21:38:29.051999 master-0 kubenswrapper[16352]: I0307 21:38:29.051918 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" Mar 07 21:38:29.074896 master-0 kubenswrapper[16352]: I0307 21:38:29.074833 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" Mar 07 21:38:29.128350 master-0 kubenswrapper[16352]: I0307 21:38:29.126472 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" Mar 07 21:38:29.398572 master-0 kubenswrapper[16352]: I0307 21:38:29.397152 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:29.398572 master-0 kubenswrapper[16352]: I0307 21:38:29.397393 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:29.398572 master-0 kubenswrapper[16352]: E0307 21:38:29.397975 16352 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 21:38:29.398572 master-0 kubenswrapper[16352]: E0307 21:38:29.398033 16352 nestedpendingoperations.go:348] Operation 
for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:30.398013536 +0000 UTC m=+1233.468718615 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "metrics-server-cert" not found Mar 07 21:38:29.400217 master-0 kubenswrapper[16352]: E0307 21:38:29.399030 16352 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 21:38:29.400217 master-0 kubenswrapper[16352]: E0307 21:38:29.399098 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:30.399074342 +0000 UTC m=+1233.469779401 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "webhook-server-cert" not found Mar 07 21:38:29.582042 master-0 kubenswrapper[16352]: I0307 21:38:29.581919 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22" event={"ID":"dcba24dc-efc6-4686-9782-ac72720cbe35","Type":"ContainerStarted","Data":"ef0933ea1eda409d46f999484d3f33fda16d43b62c06be43345d39c297e349df"} Mar 07 21:38:29.585670 master-0 kubenswrapper[16352]: I0307 21:38:29.584569 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz"] Mar 07 21:38:29.590721 master-0 kubenswrapper[16352]: I0307 21:38:29.587280 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7" event={"ID":"6d37d2c9-97c2-49ea-9629-aa17591de886","Type":"ContainerStarted","Data":"41b17f489cc745bf5b4c5772775b5d9621df40aa73bf5fff295781905c3bec0b"} Mar 07 21:38:29.603639 master-0 kubenswrapper[16352]: I0307 21:38:29.602540 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2"] Mar 07 21:38:29.603639 master-0 kubenswrapper[16352]: I0307 21:38:29.602551 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:29.603639 master-0 kubenswrapper[16352]: E0307 21:38:29.602893 16352 secret.go:189] Couldn't get secret 
openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:29.603639 master-0 kubenswrapper[16352]: E0307 21:38:29.602983 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert podName:410492a4-07d7-4b98-9c15-40200fe85474 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:31.602959706 +0000 UTC m=+1234.673664765 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert") pod "infra-operator-controller-manager-65b58d74b-rrd9h" (UID: "410492a4-07d7-4b98-9c15-40200fe85474") : secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:29.616615 master-0 kubenswrapper[16352]: W0307 21:38:29.615531 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod349cead9_443f_4a05_a890_6a85791d27c4.slice/crio-db0fd5245c676f6beb1d6d9dae4f8265eb92cb3d7af5e09c0a104eb7194dbaa2 WatchSource:0}: Error finding container db0fd5245c676f6beb1d6d9dae4f8265eb92cb3d7af5e09c0a104eb7194dbaa2: Status 404 returned error can't find the container with id db0fd5245c676f6beb1d6d9dae4f8265eb92cb3d7af5e09c0a104eb7194dbaa2 Mar 07 21:38:29.621086 master-0 kubenswrapper[16352]: I0307 21:38:29.621027 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h" event={"ID":"b4e76285-f40c-4114-b9f1-fcf795be20cc","Type":"ContainerStarted","Data":"a847c83d7448ef394d79c1dbc15cc0225815ae84acf54bfd8b4a7829c3714768"} Mar 07 21:38:29.633896 master-0 kubenswrapper[16352]: I0307 21:38:29.633715 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x" 
event={"ID":"303ed055-819e-4c32-8f99-7feec7ded5e3","Type":"ContainerStarted","Data":"50e43ef05d9c0c8d33e51db6663f396b0b926fa0bf81c51d71becde095b83d8e"} Mar 07 21:38:29.659656 master-0 kubenswrapper[16352]: I0307 21:38:29.658161 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq" event={"ID":"9edc7b0f-b45b-4110-bb13-ea19ea3442c0","Type":"ContainerStarted","Data":"c38b00e8737c730b3d8e2cb4ddeec9d4ccb0698efd9495977c59d04b0c2dbec3"} Mar 07 21:38:29.666910 master-0 kubenswrapper[16352]: W0307 21:38:29.666853 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60f0e1d8_44b8_4b6d_8804_0cc47734a848.slice/crio-cd9d75c8fdfd1bbc429921cfd22bb61839d0309807c1d311320e22ef8e6bbf94 WatchSource:0}: Error finding container cd9d75c8fdfd1bbc429921cfd22bb61839d0309807c1d311320e22ef8e6bbf94: Status 404 returned error can't find the container with id cd9d75c8fdfd1bbc429921cfd22bb61839d0309807c1d311320e22ef8e6bbf94 Mar 07 21:38:29.723190 master-0 kubenswrapper[16352]: I0307 21:38:29.723127 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g"] Mar 07 21:38:29.775643 master-0 kubenswrapper[16352]: I0307 21:38:29.775518 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w"] Mar 07 21:38:29.818167 master-0 kubenswrapper[16352]: I0307 21:38:29.818067 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt"] Mar 07 21:38:29.848308 master-0 kubenswrapper[16352]: I0307 21:38:29.848154 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c"] Mar 07 21:38:30.021994 master-0 kubenswrapper[16352]: I0307 21:38:30.021791 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:30.022408 master-0 kubenswrapper[16352]: E0307 21:38:30.022185 16352 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:30.022408 master-0 kubenswrapper[16352]: E0307 21:38:30.022338 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert podName:4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:32.022284973 +0000 UTC m=+1235.092990032 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" (UID: "4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:30.434169 master-0 kubenswrapper[16352]: I0307 21:38:30.433964 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:30.434169 master-0 kubenswrapper[16352]: I0307 21:38:30.434131 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:30.437663 master-0 kubenswrapper[16352]: E0307 21:38:30.434321 16352 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 21:38:30.437663 master-0 kubenswrapper[16352]: E0307 21:38:30.434378 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:32.434361695 +0000 UTC m=+1235.505066744 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "metrics-server-cert" not found Mar 07 21:38:30.437663 master-0 kubenswrapper[16352]: E0307 21:38:30.434790 16352 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 21:38:30.437663 master-0 kubenswrapper[16352]: E0307 21:38:30.434817 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:32.434809765 +0000 UTC m=+1235.505514824 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "webhook-server-cert" not found Mar 07 21:38:30.614243 master-0 kubenswrapper[16352]: I0307 21:38:30.611333 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq"] Mar 07 21:38:30.621373 master-0 kubenswrapper[16352]: W0307 21:38:30.621322 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda1640d02_1d34_4805_9f58_de4973079a0d.slice/crio-743439d5acaebb56f6d09577dae4664b081a8c756d5430e62bcd1c2beb00a685 WatchSource:0}: Error finding container 743439d5acaebb56f6d09577dae4664b081a8c756d5430e62bcd1c2beb00a685: Status 404 returned error can't find the container with id 743439d5acaebb56f6d09577dae4664b081a8c756d5430e62bcd1c2beb00a685 Mar 07 21:38:30.622642 master-0 kubenswrapper[16352]: 
I0307 21:38:30.622596 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9"] Mar 07 21:38:30.641935 master-0 kubenswrapper[16352]: I0307 21:38:30.641875 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4"] Mar 07 21:38:30.671618 master-0 kubenswrapper[16352]: I0307 21:38:30.671484 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt"] Mar 07 21:38:30.688248 master-0 kubenswrapper[16352]: I0307 21:38:30.688098 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" event={"ID":"a1640d02-1d34-4805-9f58-de4973079a0d","Type":"ContainerStarted","Data":"743439d5acaebb56f6d09577dae4664b081a8c756d5430e62bcd1c2beb00a685"} Mar 07 21:38:30.692325 master-0 kubenswrapper[16352]: I0307 21:38:30.692286 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w" event={"ID":"e98642c0-4c59-4b2a-98ba-7f8ae096c57f","Type":"ContainerStarted","Data":"f617978112c92fa32dd31e7100be2a8582c5b98a94d11775da5887fb6c30c44f"} Mar 07 21:38:30.694936 master-0 kubenswrapper[16352]: I0307 21:38:30.694669 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2" event={"ID":"c333624b-10f8-40cf-95ed-52fc6ccea867","Type":"ContainerStarted","Data":"f3cb6bb1e0417c77027cb6dccdd4329e97126fb3956a200b218b27eca314a45c"} Mar 07 21:38:30.695460 master-0 kubenswrapper[16352]: I0307 21:38:30.695389 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt"] Mar 07 21:38:30.697791 master-0 kubenswrapper[16352]: I0307 21:38:30.697752 16352 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt" event={"ID":"354439b7-abea-4781-9fdd-835ef380edb5","Type":"ContainerStarted","Data":"624eb234fad32f4efc7f38517fa7e9a85ac50a17e0b51d60143f2701fe3b295b"} Mar 07 21:38:30.700051 master-0 kubenswrapper[16352]: I0307 21:38:30.700010 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" event={"ID":"04444749-e365-4e55-b556-245fb7b416a2","Type":"ContainerStarted","Data":"a30fefaf5801ee454f019c60bfdcc123679225b3417e03b08c3c7f5768cd0043"} Mar 07 21:38:30.702366 master-0 kubenswrapper[16352]: I0307 21:38:30.702321 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g" event={"ID":"60f0e1d8-44b8-4b6d-8804-0cc47734a848","Type":"ContainerStarted","Data":"cd9d75c8fdfd1bbc429921cfd22bb61839d0309807c1d311320e22ef8e6bbf94"} Mar 07 21:38:30.704154 master-0 kubenswrapper[16352]: I0307 21:38:30.704094 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz" event={"ID":"349cead9-443f-4a05-a890-6a85791d27c4","Type":"ContainerStarted","Data":"db0fd5245c676f6beb1d6d9dae4f8265eb92cb3d7af5e09c0a104eb7194dbaa2"} Mar 07 21:38:30.725929 master-0 kubenswrapper[16352]: I0307 21:38:30.725736 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2"] Mar 07 21:38:30.758671 master-0 kubenswrapper[16352]: I0307 21:38:30.758601 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw"] Mar 07 21:38:30.789777 master-0 kubenswrapper[16352]: I0307 21:38:30.789718 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-648564c9fc-l7256"] Mar 07 21:38:30.811141 master-0 
kubenswrapper[16352]: I0307 21:38:30.811064 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm"] Mar 07 21:38:31.667805 master-0 kubenswrapper[16352]: I0307 21:38:31.667706 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:31.668515 master-0 kubenswrapper[16352]: E0307 21:38:31.668060 16352 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:31.668515 master-0 kubenswrapper[16352]: E0307 21:38:31.668240 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert podName:410492a4-07d7-4b98-9c15-40200fe85474 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:35.668206725 +0000 UTC m=+1238.738911794 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert") pod "infra-operator-controller-manager-65b58d74b-rrd9h" (UID: "410492a4-07d7-4b98-9c15-40200fe85474") : secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:32.076192 master-0 kubenswrapper[16352]: I0307 21:38:32.076056 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:32.076668 master-0 kubenswrapper[16352]: E0307 21:38:32.076251 16352 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:32.076668 master-0 kubenswrapper[16352]: E0307 21:38:32.076346 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert podName:4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:36.076324963 +0000 UTC m=+1239.147030022 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" (UID: "4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:32.486045 master-0 kubenswrapper[16352]: I0307 21:38:32.485657 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:32.486571 master-0 kubenswrapper[16352]: E0307 21:38:32.486514 16352 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 21:38:32.486641 master-0 kubenswrapper[16352]: E0307 21:38:32.486610 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:36.486589132 +0000 UTC m=+1239.557294191 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "metrics-server-cert" not found Mar 07 21:38:32.486754 master-0 kubenswrapper[16352]: I0307 21:38:32.486620 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:32.486818 master-0 kubenswrapper[16352]: E0307 21:38:32.486789 16352 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 21:38:32.486876 master-0 kubenswrapper[16352]: E0307 21:38:32.486860 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:36.486843578 +0000 UTC m=+1239.557548637 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "webhook-server-cert" not found Mar 07 21:38:32.728619 master-0 kubenswrapper[16352]: I0307 21:38:32.728556 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" event={"ID":"90729f66-afc2-451e-b136-964efe09b675","Type":"ContainerStarted","Data":"e6b67e1783e1cafe7a80e8c33bf791727d9a32c79532e7cf76ddfd40f289dc92"} Mar 07 21:38:33.181238 master-0 kubenswrapper[16352]: W0307 21:38:33.181131 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcaf39759_7091_4ef5_8013_c7a487f72d76.slice/crio-6fb0f96767e55ba78b73783c7a555aff8db1521966bb08370ec1be638e410f9e WatchSource:0}: Error finding container 6fb0f96767e55ba78b73783c7a555aff8db1521966bb08370ec1be638e410f9e: Status 404 returned error can't find the container with id 6fb0f96767e55ba78b73783c7a555aff8db1521966bb08370ec1be638e410f9e Mar 07 21:38:33.191217 master-0 kubenswrapper[16352]: W0307 21:38:33.191140 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01456598_46a5_49a8_8da8_3f47d303fd88.slice/crio-c9109a9958a76030c7b3ce1c753e756e05cc964ad0e70c01550167ebc4ed26bc WatchSource:0}: Error finding container c9109a9958a76030c7b3ce1c753e756e05cc964ad0e70c01550167ebc4ed26bc: Status 404 returned error can't find the container with id c9109a9958a76030c7b3ce1c753e756e05cc964ad0e70c01550167ebc4ed26bc Mar 07 21:38:33.748947 master-0 kubenswrapper[16352]: I0307 21:38:33.745207 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" 
event={"ID":"caf39759-7091-4ef5-8013-c7a487f72d76","Type":"ContainerStarted","Data":"6fb0f96767e55ba78b73783c7a555aff8db1521966bb08370ec1be638e410f9e"} Mar 07 21:38:33.748947 master-0 kubenswrapper[16352]: I0307 21:38:33.747115 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" event={"ID":"01456598-46a5-49a8-8da8-3f47d303fd88","Type":"ContainerStarted","Data":"c9109a9958a76030c7b3ce1c753e756e05cc964ad0e70c01550167ebc4ed26bc"} Mar 07 21:38:33.759263 master-0 kubenswrapper[16352]: W0307 21:38:33.757665 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeed655d4_c4c5_4dbf_86ff_a5ac776d428d.slice/crio-26b4208b35496574947c867bedd1dd17679dfce736ed7b9b51cc7b18b8d9febd WatchSource:0}: Error finding container 26b4208b35496574947c867bedd1dd17679dfce736ed7b9b51cc7b18b8d9febd: Status 404 returned error can't find the container with id 26b4208b35496574947c867bedd1dd17679dfce736ed7b9b51cc7b18b8d9febd Mar 07 21:38:33.760516 master-0 kubenswrapper[16352]: W0307 21:38:33.759762 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode58eb7fb_23b7_44ef_912c_d4dac0a71277.slice/crio-65a4a0ddbea58b5b63b2fd2b56ec9b6658ebc41053c19d59291e19482574679c WatchSource:0}: Error finding container 65a4a0ddbea58b5b63b2fd2b56ec9b6658ebc41053c19d59291e19482574679c: Status 404 returned error can't find the container with id 65a4a0ddbea58b5b63b2fd2b56ec9b6658ebc41053c19d59291e19482574679c Mar 07 21:38:34.360092 master-0 kubenswrapper[16352]: W0307 21:38:34.359978 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod733393c5_1380_4bb8_99e0_aa1757e573ab.slice/crio-2f5adac368b075e1547886846497128add8ee123a9068bd65783c7c8d4a3ead0 WatchSource:0}: Error finding container 
2f5adac368b075e1547886846497128add8ee123a9068bd65783c7c8d4a3ead0: Status 404 returned error can't find the container with id 2f5adac368b075e1547886846497128add8ee123a9068bd65783c7c8d4a3ead0 Mar 07 21:38:34.361293 master-0 kubenswrapper[16352]: W0307 21:38:34.361177 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod848e3768_065a_4a9b_bc8c_eb34b6eb2560.slice/crio-bac2750d93d0a294a3d0fbb099690f9597e67bd7d1bf1232d9cc7a4a44b1cb15 WatchSource:0}: Error finding container bac2750d93d0a294a3d0fbb099690f9597e67bd7d1bf1232d9cc7a4a44b1cb15: Status 404 returned error can't find the container with id bac2750d93d0a294a3d0fbb099690f9597e67bd7d1bf1232d9cc7a4a44b1cb15 Mar 07 21:38:34.362663 master-0 kubenswrapper[16352]: W0307 21:38:34.362599 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcf92475a_b059_4a3d_8781_f3c49c45fde1.slice/crio-bea0b9666d47551276156ad4c9213482ae828c032b626588867c1429a742a33e WatchSource:0}: Error finding container bea0b9666d47551276156ad4c9213482ae828c032b626588867c1429a742a33e: Status 404 returned error can't find the container with id bea0b9666d47551276156ad4c9213482ae828c032b626588867c1429a742a33e Mar 07 21:38:34.759773 master-0 kubenswrapper[16352]: I0307 21:38:34.759515 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" event={"ID":"cf92475a-b059-4a3d-8781-f3c49c45fde1","Type":"ContainerStarted","Data":"bea0b9666d47551276156ad4c9213482ae828c032b626588867c1429a742a33e"} Mar 07 21:38:34.761789 master-0 kubenswrapper[16352]: I0307 21:38:34.761731 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" 
event={"ID":"eed655d4-c4c5-4dbf-86ff-a5ac776d428d","Type":"ContainerStarted","Data":"26b4208b35496574947c867bedd1dd17679dfce736ed7b9b51cc7b18b8d9febd"} Mar 07 21:38:34.764081 master-0 kubenswrapper[16352]: I0307 21:38:34.764038 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" event={"ID":"733393c5-1380-4bb8-99e0-aa1757e573ab","Type":"ContainerStarted","Data":"2f5adac368b075e1547886846497128add8ee123a9068bd65783c7c8d4a3ead0"} Mar 07 21:38:34.766038 master-0 kubenswrapper[16352]: I0307 21:38:34.765986 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" event={"ID":"848e3768-065a-4a9b-bc8c-eb34b6eb2560","Type":"ContainerStarted","Data":"bac2750d93d0a294a3d0fbb099690f9597e67bd7d1bf1232d9cc7a4a44b1cb15"} Mar 07 21:38:34.769200 master-0 kubenswrapper[16352]: I0307 21:38:34.769143 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" event={"ID":"e58eb7fb-23b7-44ef-912c-d4dac0a71277","Type":"ContainerStarted","Data":"65a4a0ddbea58b5b63b2fd2b56ec9b6658ebc41053c19d59291e19482574679c"} Mar 07 21:38:35.769131 master-0 kubenswrapper[16352]: I0307 21:38:35.769038 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:35.769948 master-0 kubenswrapper[16352]: E0307 21:38:35.769320 16352 secret.go:189] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:35.769948 master-0 kubenswrapper[16352]: E0307 21:38:35.769454 16352 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert podName:410492a4-07d7-4b98-9c15-40200fe85474 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:43.769426542 +0000 UTC m=+1246.840131601 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert") pod "infra-operator-controller-manager-65b58d74b-rrd9h" (UID: "410492a4-07d7-4b98-9c15-40200fe85474") : secret "infra-operator-webhook-server-cert" not found Mar 07 21:38:36.077986 master-0 kubenswrapper[16352]: I0307 21:38:36.077831 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:36.078231 master-0 kubenswrapper[16352]: E0307 21:38:36.078085 16352 secret.go:189] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:36.078231 master-0 kubenswrapper[16352]: E0307 21:38:36.078211 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert podName:4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:44.078185749 +0000 UTC m=+1247.148890808 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert") pod "openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" (UID: "4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 07 21:38:36.491261 master-0 kubenswrapper[16352]: I0307 21:38:36.491182 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:36.491533 master-0 kubenswrapper[16352]: E0307 21:38:36.491457 16352 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 21:38:36.491621 master-0 kubenswrapper[16352]: E0307 21:38:36.491572 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:44.491544852 +0000 UTC m=+1247.562249911 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "webhook-server-cert" not found Mar 07 21:38:36.492248 master-0 kubenswrapper[16352]: I0307 21:38:36.492225 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:36.492405 master-0 kubenswrapper[16352]: E0307 21:38:36.492384 16352 secret.go:189] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 07 21:38:36.492475 master-0 kubenswrapper[16352]: E0307 21:38:36.492451 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:38:44.492440144 +0000 UTC m=+1247.563145363 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "metrics-server-cert" not found Mar 07 21:38:43.837959 master-0 kubenswrapper[16352]: I0307 21:38:43.836844 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:43.854632 master-0 kubenswrapper[16352]: I0307 21:38:43.841881 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/410492a4-07d7-4b98-9c15-40200fe85474-cert\") pod \"infra-operator-controller-manager-65b58d74b-rrd9h\" (UID: \"410492a4-07d7-4b98-9c15-40200fe85474\") " pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:43.991586 master-0 kubenswrapper[16352]: I0307 21:38:43.991448 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:44.143605 master-0 kubenswrapper[16352]: I0307 21:38:44.143384 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:44.148904 master-0 kubenswrapper[16352]: I0307 21:38:44.148527 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5-cert\") pod \"openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5\" (UID: \"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:44.213364 master-0 kubenswrapper[16352]: I0307 21:38:44.213210 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:44.555781 master-0 kubenswrapper[16352]: I0307 21:38:44.555223 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:44.555781 master-0 kubenswrapper[16352]: I0307 21:38:44.555419 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:44.555781 master-0 kubenswrapper[16352]: E0307 21:38:44.555637 16352 secret.go:189] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 07 21:38:44.555781 master-0 kubenswrapper[16352]: E0307 21:38:44.555756 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs podName:becc66df-cd3f-48e5-8684-f0cf6fabd257 nodeName:}" failed. No retries permitted until 2026-03-07 21:39:00.555735204 +0000 UTC m=+1263.626440283 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs") pod "openstack-operator-controller-manager-7dfcb4d64f-grrjr" (UID: "becc66df-cd3f-48e5-8684-f0cf6fabd257") : secret "webhook-server-cert" not found Mar 07 21:38:44.563140 master-0 kubenswrapper[16352]: I0307 21:38:44.563091 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-metrics-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:38:47.771783 master-0 kubenswrapper[16352]: I0307 21:38:47.771232 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h"] Mar 07 21:38:47.919600 master-0 kubenswrapper[16352]: I0307 21:38:47.916506 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5"] Mar 07 21:38:47.963231 master-0 kubenswrapper[16352]: I0307 21:38:47.959804 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz" event={"ID":"349cead9-443f-4a05-a890-6a85791d27c4","Type":"ContainerStarted","Data":"3ec70dccc54eb5551e4f88883fcca4d53a27bf09ce092fc579ec8aeb91459c55"} Mar 07 21:38:47.963231 master-0 kubenswrapper[16352]: I0307 21:38:47.959917 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz" Mar 07 21:38:47.963231 master-0 kubenswrapper[16352]: I0307 21:38:47.962226 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" 
event={"ID":"410492a4-07d7-4b98-9c15-40200fe85474","Type":"ContainerStarted","Data":"27c8f5ff126d614a1a7338476f62bb3caa9bf526e53206176c0c8e1d1196f7f2"} Mar 07 21:38:47.963820 master-0 kubenswrapper[16352]: I0307 21:38:47.963658 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2" event={"ID":"c333624b-10f8-40cf-95ed-52fc6ccea867","Type":"ContainerStarted","Data":"21f65c1094f29d9016ab43c7d7eea0fc5e0a7b5d674d2ff69b8b653026f91bf0"} Mar 07 21:38:47.964393 master-0 kubenswrapper[16352]: I0307 21:38:47.964351 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2" Mar 07 21:38:47.973329 master-0 kubenswrapper[16352]: I0307 21:38:47.973220 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22" event={"ID":"dcba24dc-efc6-4686-9782-ac72720cbe35","Type":"ContainerStarted","Data":"382292bcbdf56cc9784f68d090b8af1c5e06c5469cf4264b6cfea8b83da45707"} Mar 07 21:38:47.973966 master-0 kubenswrapper[16352]: I0307 21:38:47.973943 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22" Mar 07 21:38:47.979575 master-0 kubenswrapper[16352]: I0307 21:38:47.979519 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" Mar 07 21:38:47.981256 master-0 kubenswrapper[16352]: I0307 21:38:47.981201 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7" event={"ID":"6d37d2c9-97c2-49ea-9629-aa17591de886","Type":"ContainerStarted","Data":"21993dbd819394972314ee9af8f5f34d6a9cdbcba20baa94e21ebee58b483996"} Mar 07 21:38:47.981703 master-0 kubenswrapper[16352]: I0307 21:38:47.981631 16352 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7" Mar 07 21:38:47.983431 master-0 kubenswrapper[16352]: I0307 21:38:47.983385 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt" event={"ID":"354439b7-abea-4781-9fdd-835ef380edb5","Type":"ContainerStarted","Data":"6ac288e86609ab4d2fc5d7c8423baa483ecead721a63f4fe88065ede8a866b3d"} Mar 07 21:38:47.984021 master-0 kubenswrapper[16352]: I0307 21:38:47.983984 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt" Mar 07 21:38:47.986440 master-0 kubenswrapper[16352]: I0307 21:38:47.986393 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h" event={"ID":"b4e76285-f40c-4114-b9f1-fcf795be20cc","Type":"ContainerStarted","Data":"d23e425a63e7399dd30d139053f9983c1750d7866834f639d2738f89da40784e"} Mar 07 21:38:47.986575 master-0 kubenswrapper[16352]: I0307 21:38:47.986533 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h" Mar 07 21:38:48.025077 master-0 kubenswrapper[16352]: I0307 21:38:48.021506 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h" podStartSLOduration=8.220810559 podStartE2EDuration="21.021478475s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:28.583302698 +0000 UTC m=+1231.654007757" lastFinishedPulling="2026-03-07 21:38:41.383970614 +0000 UTC m=+1244.454675673" observedRunningTime="2026-03-07 21:38:48.012232313 +0000 UTC m=+1251.082937362" watchObservedRunningTime="2026-03-07 21:38:48.021478475 +0000 UTC m=+1251.092183534" Mar 07 
21:38:48.025077 master-0 kubenswrapper[16352]: I0307 21:38:48.021624 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz" podStartSLOduration=7.1749312419999995 podStartE2EDuration="21.021619199s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.641760256 +0000 UTC m=+1232.712465315" lastFinishedPulling="2026-03-07 21:38:43.488448203 +0000 UTC m=+1246.559153272" observedRunningTime="2026-03-07 21:38:47.9920288 +0000 UTC m=+1251.062733859" watchObservedRunningTime="2026-03-07 21:38:48.021619199 +0000 UTC m=+1251.092324258" Mar 07 21:38:48.066578 master-0 kubenswrapper[16352]: I0307 21:38:48.066376 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7" podStartSLOduration=7.584073704 podStartE2EDuration="21.06635189s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:28.721316195 +0000 UTC m=+1231.792021254" lastFinishedPulling="2026-03-07 21:38:42.203594381 +0000 UTC m=+1245.274299440" observedRunningTime="2026-03-07 21:38:48.053148174 +0000 UTC m=+1251.123853233" watchObservedRunningTime="2026-03-07 21:38:48.06635189 +0000 UTC m=+1251.137056949" Mar 07 21:38:48.088547 master-0 kubenswrapper[16352]: I0307 21:38:48.088476 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt" podStartSLOduration=5.548623858 podStartE2EDuration="21.08845765s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.732026178 +0000 UTC m=+1232.802731237" lastFinishedPulling="2026-03-07 21:38:45.27185996 +0000 UTC m=+1248.342565029" observedRunningTime="2026-03-07 21:38:48.08553149 +0000 UTC m=+1251.156236549" watchObservedRunningTime="2026-03-07 21:38:48.08845765 +0000 UTC 
m=+1251.159162699" Mar 07 21:38:48.183522 master-0 kubenswrapper[16352]: I0307 21:38:48.179357 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" podStartSLOduration=8.409334454 podStartE2EDuration="21.179330787s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:34.387141055 +0000 UTC m=+1237.457846144" lastFinishedPulling="2026-03-07 21:38:47.157137378 +0000 UTC m=+1250.227842477" observedRunningTime="2026-03-07 21:38:48.122656759 +0000 UTC m=+1251.193361818" watchObservedRunningTime="2026-03-07 21:38:48.179330787 +0000 UTC m=+1251.250035846" Mar 07 21:38:48.213637 master-0 kubenswrapper[16352]: I0307 21:38:48.213466 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22" podStartSLOduration=8.049303049 podStartE2EDuration="21.213400503s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.039512398 +0000 UTC m=+1232.110217457" lastFinishedPulling="2026-03-07 21:38:42.203609852 +0000 UTC m=+1245.274314911" observedRunningTime="2026-03-07 21:38:48.16399748 +0000 UTC m=+1251.234702549" watchObservedRunningTime="2026-03-07 21:38:48.213400503 +0000 UTC m=+1251.284105582" Mar 07 21:38:48.249907 master-0 kubenswrapper[16352]: I0307 21:38:48.249824 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2" podStartSLOduration=8.667681395 podStartE2EDuration="21.249789195s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.621709476 +0000 UTC m=+1232.692414535" lastFinishedPulling="2026-03-07 21:38:42.203817276 +0000 UTC m=+1245.274522335" observedRunningTime="2026-03-07 21:38:48.208589758 +0000 UTC m=+1251.279294837" watchObservedRunningTime="2026-03-07 
21:38:48.249789195 +0000 UTC m=+1251.320494254" Mar 07 21:38:49.006392 master-0 kubenswrapper[16352]: I0307 21:38:49.006205 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" event={"ID":"90729f66-afc2-451e-b136-964efe09b675","Type":"ContainerStarted","Data":"135438eb4a2f1aa4b96eaabf5a8d414c810be1b8aa6a193b57fd68cc707401a5"} Mar 07 21:38:49.008916 master-0 kubenswrapper[16352]: I0307 21:38:49.008872 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" Mar 07 21:38:49.015628 master-0 kubenswrapper[16352]: I0307 21:38:49.015567 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq" event={"ID":"9edc7b0f-b45b-4110-bb13-ea19ea3442c0","Type":"ContainerStarted","Data":"89eff62acde05ecb40b38c8b9da833efb315cff9f2f840d7991ac5dce082bd60"} Mar 07 21:38:49.019693 master-0 kubenswrapper[16352]: I0307 21:38:49.016853 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq" Mar 07 21:38:49.042755 master-0 kubenswrapper[16352]: I0307 21:38:49.039194 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w" event={"ID":"e98642c0-4c59-4b2a-98ba-7f8ae096c57f","Type":"ContainerStarted","Data":"ea89d789309353467d0f6134b3d57ea63cc4ffcc0a359e67c579924556c55c06"} Mar 07 21:38:49.046380 master-0 kubenswrapper[16352]: I0307 21:38:49.043360 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w" Mar 07 21:38:49.052674 master-0 kubenswrapper[16352]: I0307 21:38:49.049678 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" event={"ID":"caf39759-7091-4ef5-8013-c7a487f72d76","Type":"ContainerStarted","Data":"92190d3ad8569258770ecce381fa937aaa6a0cb0da8c5c258a45777d80624e91"} Mar 07 21:38:49.052674 master-0 kubenswrapper[16352]: I0307 21:38:49.050549 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" Mar 07 21:38:49.069818 master-0 kubenswrapper[16352]: I0307 21:38:49.069657 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" event={"ID":"cf92475a-b059-4a3d-8781-f3c49c45fde1","Type":"ContainerStarted","Data":"17b896fef137b449344865f35d5d657ecc48b14a800deb41c4b5c82aa45e4083"} Mar 07 21:38:49.103716 master-0 kubenswrapper[16352]: I0307 21:38:49.093403 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" event={"ID":"eed655d4-c4c5-4dbf-86ff-a5ac776d428d","Type":"ContainerStarted","Data":"f28eaacf87e1b7a057f087cb82bc7de78afe68ebd2cd80325c1919c165c0e73d"} Mar 07 21:38:49.103716 master-0 kubenswrapper[16352]: I0307 21:38:49.093587 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" Mar 07 21:38:49.103716 master-0 kubenswrapper[16352]: I0307 21:38:49.096781 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" event={"ID":"01456598-46a5-49a8-8da8-3f47d303fd88","Type":"ContainerStarted","Data":"cfdb4f77c571ba47ce71492e8b2bf6e21c2e27d548faac5aa3c24b2dd9c17088"} Mar 07 21:38:49.103716 master-0 kubenswrapper[16352]: I0307 21:38:49.097434 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" Mar 07 
21:38:49.103716 master-0 kubenswrapper[16352]: I0307 21:38:49.099195 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" podStartSLOduration=6.923162479 podStartE2EDuration="22.099156004s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:31.982651639 +0000 UTC m=+1235.053356698" lastFinishedPulling="2026-03-07 21:38:47.158645164 +0000 UTC m=+1250.229350223" observedRunningTime="2026-03-07 21:38:49.041926753 +0000 UTC m=+1252.112631812" watchObservedRunningTime="2026-03-07 21:38:49.099156004 +0000 UTC m=+1252.169861063" Mar 07 21:38:49.103716 master-0 kubenswrapper[16352]: I0307 21:38:49.103312 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq" podStartSLOduration=8.932009548 podStartE2EDuration="22.103292444s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.032478769 +0000 UTC m=+1232.103183828" lastFinishedPulling="2026-03-07 21:38:42.203761665 +0000 UTC m=+1245.274466724" observedRunningTime="2026-03-07 21:38:49.090491386 +0000 UTC m=+1252.161196445" watchObservedRunningTime="2026-03-07 21:38:49.103292444 +0000 UTC m=+1252.173997503" Mar 07 21:38:49.109949 master-0 kubenswrapper[16352]: I0307 21:38:49.105208 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" event={"ID":"733393c5-1380-4bb8-99e0-aa1757e573ab","Type":"ContainerStarted","Data":"8a1e96344e5a84cfb9576a00e24c77dcc99b32728cb5586b7d435fde0b119732"} Mar 07 21:38:49.109949 master-0 kubenswrapper[16352]: I0307 21:38:49.105783 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" Mar 07 21:38:49.124527 master-0 kubenswrapper[16352]: I0307 
21:38:49.110799 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" event={"ID":"e58eb7fb-23b7-44ef-912c-d4dac0a71277","Type":"ContainerStarted","Data":"c05ab30ba7439fe3d1730848b6be403c2fa0ee9fc4940e7d1fb3055355ba450d"} Mar 07 21:38:49.124527 master-0 kubenswrapper[16352]: I0307 21:38:49.111550 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" Mar 07 21:38:49.130259 master-0 kubenswrapper[16352]: I0307 21:38:49.130206 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" event={"ID":"848e3768-065a-4a9b-bc8c-eb34b6eb2560","Type":"ContainerStarted","Data":"5c32e3f4d9db1858ea9bc963a92f792f9b7ffab2c1660286f49dadf2c6d645cb"} Mar 07 21:38:49.133604 master-0 kubenswrapper[16352]: I0307 21:38:49.133571 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" Mar 07 21:38:49.139751 master-0 kubenswrapper[16352]: I0307 21:38:49.139671 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" event={"ID":"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5","Type":"ContainerStarted","Data":"663f27df9c82d902e5b3991c4c4b429fce6ce997203d86d7dbf3b61941d78ae6"} Mar 07 21:38:49.141503 master-0 kubenswrapper[16352]: I0307 21:38:49.141430 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w" podStartSLOduration=4.752456474 podStartE2EDuration="22.141410017s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.695435132 +0000 UTC m=+1232.766140191" lastFinishedPulling="2026-03-07 21:38:47.084388675 +0000 UTC m=+1250.155093734" 
observedRunningTime="2026-03-07 21:38:49.137276288 +0000 UTC m=+1252.207981347" watchObservedRunningTime="2026-03-07 21:38:49.141410017 +0000 UTC m=+1252.212115076" Mar 07 21:38:49.145625 master-0 kubenswrapper[16352]: I0307 21:38:49.144421 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x" event={"ID":"303ed055-819e-4c32-8f99-7feec7ded5e3","Type":"ContainerStarted","Data":"848e90823d59d2f8250113dff8dfedee3b3360aa8c34622a718bc8d9ebc23a71"} Mar 07 21:38:49.145625 master-0 kubenswrapper[16352]: I0307 21:38:49.145578 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x" Mar 07 21:38:49.158782 master-0 kubenswrapper[16352]: I0307 21:38:49.155639 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" event={"ID":"04444749-e365-4e55-b556-245fb7b416a2","Type":"ContainerStarted","Data":"73fe5de04446178d1aa1e20e7f090a9f0c5bcbe186fa98fde88f5761877ae65c"} Mar 07 21:38:49.158782 master-0 kubenswrapper[16352]: I0307 21:38:49.156757 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" Mar 07 21:38:49.190322 master-0 kubenswrapper[16352]: I0307 21:38:49.182943 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g" event={"ID":"60f0e1d8-44b8-4b6d-8804-0cc47734a848","Type":"ContainerStarted","Data":"9451a411287f59fd70be043fc5c423032cfcd2a8c0144e8c45c4364af027c431"} Mar 07 21:38:49.190322 master-0 kubenswrapper[16352]: I0307 21:38:49.187438 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g" Mar 07 21:38:49.203991 master-0 kubenswrapper[16352]: I0307 
21:38:49.203728 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" podStartSLOduration=8.237857616 podStartE2EDuration="22.203674218s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:33.191185903 +0000 UTC m=+1236.261890962" lastFinishedPulling="2026-03-07 21:38:47.157002505 +0000 UTC m=+1250.227707564" observedRunningTime="2026-03-07 21:38:49.196195579 +0000 UTC m=+1252.266900638" watchObservedRunningTime="2026-03-07 21:38:49.203674218 +0000 UTC m=+1252.274379277" Mar 07 21:38:49.254711 master-0 kubenswrapper[16352]: I0307 21:38:49.247233 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" event={"ID":"a1640d02-1d34-4805-9f58-de4973079a0d","Type":"ContainerStarted","Data":"48325d43d0a9106cd791894f3c94120450388ccee62e0d722f3878f51d6cef56"} Mar 07 21:38:49.254711 master-0 kubenswrapper[16352]: I0307 21:38:49.247508 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" podStartSLOduration=9.474621818 podStartE2EDuration="22.247481688s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:34.386880809 +0000 UTC m=+1237.457585908" lastFinishedPulling="2026-03-07 21:38:47.159740719 +0000 UTC m=+1250.230445778" observedRunningTime="2026-03-07 21:38:49.233955034 +0000 UTC m=+1252.304660093" watchObservedRunningTime="2026-03-07 21:38:49.247481688 +0000 UTC m=+1252.318186757" Mar 07 21:38:49.309548 master-0 kubenswrapper[16352]: I0307 21:38:49.309355 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" podStartSLOduration=9.539329148 podStartE2EDuration="22.30932994s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" 
firstStartedPulling="2026-03-07 21:38:34.387461073 +0000 UTC m=+1237.458166132" lastFinishedPulling="2026-03-07 21:38:47.157461825 +0000 UTC m=+1250.228166924" observedRunningTime="2026-03-07 21:38:49.280944679 +0000 UTC m=+1252.351649738" watchObservedRunningTime="2026-03-07 21:38:49.30932994 +0000 UTC m=+1252.380035009" Mar 07 21:38:49.338716 master-0 kubenswrapper[16352]: I0307 21:38:49.338519 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" podStartSLOduration=8.372733018 podStartE2EDuration="22.338486218s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:33.193908238 +0000 UTC m=+1236.264613297" lastFinishedPulling="2026-03-07 21:38:47.159661398 +0000 UTC m=+1250.230366497" observedRunningTime="2026-03-07 21:38:49.323627222 +0000 UTC m=+1252.394332281" watchObservedRunningTime="2026-03-07 21:38:49.338486218 +0000 UTC m=+1252.409191287" Mar 07 21:38:49.404708 master-0 kubenswrapper[16352]: I0307 21:38:49.404504 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" podStartSLOduration=8.936364711 podStartE2EDuration="22.404484379s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:33.761824034 +0000 UTC m=+1236.832529093" lastFinishedPulling="2026-03-07 21:38:47.229943682 +0000 UTC m=+1250.300648761" observedRunningTime="2026-03-07 21:38:49.373070196 +0000 UTC m=+1252.443775255" watchObservedRunningTime="2026-03-07 21:38:49.404484379 +0000 UTC m=+1252.475189438" Mar 07 21:38:49.418717 master-0 kubenswrapper[16352]: I0307 21:38:49.414422 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" podStartSLOduration=8.655442201 podStartE2EDuration="22.414396287s" 
podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.729951359 +0000 UTC m=+1232.800656418" lastFinishedPulling="2026-03-07 21:38:43.488905425 +0000 UTC m=+1246.559610504" observedRunningTime="2026-03-07 21:38:49.404728645 +0000 UTC m=+1252.475433704" watchObservedRunningTime="2026-03-07 21:38:49.414396287 +0000 UTC m=+1252.485101346" Mar 07 21:38:49.454711 master-0 kubenswrapper[16352]: I0307 21:38:49.452217 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x" podStartSLOduration=9.202455516 podStartE2EDuration="22.452189472s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:28.953835885 +0000 UTC m=+1232.024540944" lastFinishedPulling="2026-03-07 21:38:42.203569841 +0000 UTC m=+1245.274274900" observedRunningTime="2026-03-07 21:38:49.431275191 +0000 UTC m=+1252.501980250" watchObservedRunningTime="2026-03-07 21:38:49.452189472 +0000 UTC m=+1252.522894541" Mar 07 21:38:49.499775 master-0 kubenswrapper[16352]: I0307 21:38:49.497125 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g" podStartSLOduration=5.091701851 podStartE2EDuration="22.497099758s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:29.678926576 +0000 UTC m=+1232.749631645" lastFinishedPulling="2026-03-07 21:38:47.084324493 +0000 UTC m=+1250.155029552" observedRunningTime="2026-03-07 21:38:49.461102335 +0000 UTC m=+1252.531807394" watchObservedRunningTime="2026-03-07 21:38:49.497099758 +0000 UTC m=+1252.567804817" Mar 07 21:38:49.504716 master-0 kubenswrapper[16352]: I0307 21:38:49.502712 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" podStartSLOduration=9.036576202 
podStartE2EDuration="22.502686912s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:33.762952251 +0000 UTC m=+1236.833657310" lastFinishedPulling="2026-03-07 21:38:47.229062961 +0000 UTC m=+1250.299768020" observedRunningTime="2026-03-07 21:38:49.489425734 +0000 UTC m=+1252.560130783" watchObservedRunningTime="2026-03-07 21:38:49.502686912 +0000 UTC m=+1252.573391971" Mar 07 21:38:49.521830 master-0 kubenswrapper[16352]: I0307 21:38:49.521729 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-r8xj9" podStartSLOduration=4.793281199 podStartE2EDuration="21.521701207s" podCreationTimestamp="2026-03-07 21:38:28 +0000 UTC" firstStartedPulling="2026-03-07 21:38:30.626300693 +0000 UTC m=+1233.697005752" lastFinishedPulling="2026-03-07 21:38:47.354720691 +0000 UTC m=+1250.425425760" observedRunningTime="2026-03-07 21:38:49.511050442 +0000 UTC m=+1252.581755511" watchObservedRunningTime="2026-03-07 21:38:49.521701207 +0000 UTC m=+1252.592406266" Mar 07 21:38:52.260673 master-0 kubenswrapper[16352]: I0307 21:38:52.260468 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" event={"ID":"4c6971cf-ab38-4ace-a7f2-06d8d1ccaca5","Type":"ContainerStarted","Data":"42ae6e5c7a8e3490f110f20c12bba7c0a1c1f93059597216cb829edc56ece1a7"} Mar 07 21:38:52.260673 master-0 kubenswrapper[16352]: I0307 21:38:52.260588 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:38:52.263825 master-0 kubenswrapper[16352]: I0307 21:38:52.262745 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" 
event={"ID":"410492a4-07d7-4b98-9c15-40200fe85474","Type":"ContainerStarted","Data":"c00640cef29557248ac8dc7dcbd620357efcc93fc07d333329fd9ad8553860af"} Mar 07 21:38:52.263825 master-0 kubenswrapper[16352]: I0307 21:38:52.262893 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:38:52.299049 master-0 kubenswrapper[16352]: I0307 21:38:52.298938 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" podStartSLOduration=21.308603415 podStartE2EDuration="25.298917414s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:47.971285653 +0000 UTC m=+1251.041990702" lastFinishedPulling="2026-03-07 21:38:51.961599642 +0000 UTC m=+1255.032304701" observedRunningTime="2026-03-07 21:38:52.287330866 +0000 UTC m=+1255.358035925" watchObservedRunningTime="2026-03-07 21:38:52.298917414 +0000 UTC m=+1255.369622473" Mar 07 21:38:52.314538 master-0 kubenswrapper[16352]: I0307 21:38:52.314425 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" podStartSLOduration=21.180187197 podStartE2EDuration="25.314395684s" podCreationTimestamp="2026-03-07 21:38:27 +0000 UTC" firstStartedPulling="2026-03-07 21:38:47.821788391 +0000 UTC m=+1250.892493450" lastFinishedPulling="2026-03-07 21:38:51.955996878 +0000 UTC m=+1255.026701937" observedRunningTime="2026-03-07 21:38:52.308955674 +0000 UTC m=+1255.379660743" watchObservedRunningTime="2026-03-07 21:38:52.314395684 +0000 UTC m=+1255.385100743" Mar 07 21:38:57.575477 master-0 kubenswrapper[16352]: I0307 21:38:57.575303 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-6db6876945-nlssq" Mar 07 21:38:57.622834 master-0 
kubenswrapper[16352]: I0307 21:38:57.619202 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-55d77d7b5c-hjt7h" Mar 07 21:38:57.650131 master-0 kubenswrapper[16352]: I0307 21:38:57.650049 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-5d87c9d997-jzt22" Mar 07 21:38:57.753701 master-0 kubenswrapper[16352]: I0307 21:38:57.753338 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-64db6967f8-mq69x" Mar 07 21:38:57.756193 master-0 kubenswrapper[16352]: I0307 21:38:57.754782 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-cf99c678f-qmcr7" Mar 07 21:38:58.313281 master-0 kubenswrapper[16352]: I0307 21:38:58.313176 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-545456dc4-xth7w" Mar 07 21:38:58.377955 master-0 kubenswrapper[16352]: I0307 21:38:58.377870 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-78bc7f9bd9-rcxp2" Mar 07 21:38:58.378590 master-0 kubenswrapper[16352]: I0307 21:38:58.378519 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-7c789f89c6-zq79c" Mar 07 21:38:58.414102 master-0 kubenswrapper[16352]: I0307 21:38:58.414026 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-67d996989d-7ksrz" Mar 07 21:38:58.462855 master-0 kubenswrapper[16352]: I0307 21:38:58.462786 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/mariadb-operator-controller-manager-7b6bfb6475-j288g" Mar 07 21:38:58.477769 master-0 kubenswrapper[16352]: I0307 21:38:58.477702 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-54688575f-vj8dt" Mar 07 21:38:58.535964 master-0 kubenswrapper[16352]: I0307 21:38:58.534138 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-74b6b5dc96-ndppt" Mar 07 21:38:58.561061 master-0 kubenswrapper[16352]: I0307 21:38:58.559269 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5d86c7ddb7-2plwq" Mar 07 21:38:58.660582 master-0 kubenswrapper[16352]: I0307 21:38:58.660522 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-75684d597f-ccbn4" Mar 07 21:38:58.683900 master-0 kubenswrapper[16352]: I0307 21:38:58.683746 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-648564c9fc-l7256" Mar 07 21:38:58.757321 master-0 kubenswrapper[16352]: I0307 21:38:58.757269 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-9b9ff9f4d-s8tqw" Mar 07 21:38:58.784858 master-0 kubenswrapper[16352]: I0307 21:38:58.784744 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-bccc79885-k5rcm" Mar 07 21:38:59.055741 master-0 kubenswrapper[16352]: I0307 21:38:59.055572 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-55b5ff4dbb-9cpc2" Mar 07 21:38:59.079022 master-0 kubenswrapper[16352]: I0307 21:38:59.078948 16352 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-5fdb694969-bbqxt" Mar 07 21:39:00.641166 master-0 kubenswrapper[16352]: I0307 21:39:00.641063 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:39:00.647879 master-0 kubenswrapper[16352]: I0307 21:39:00.646073 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/becc66df-cd3f-48e5-8684-f0cf6fabd257-webhook-certs\") pod \"openstack-operator-controller-manager-7dfcb4d64f-grrjr\" (UID: \"becc66df-cd3f-48e5-8684-f0cf6fabd257\") " pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:39:00.906985 master-0 kubenswrapper[16352]: I0307 21:39:00.906734 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:39:01.434416 master-0 kubenswrapper[16352]: W0307 21:39:01.434333 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbecc66df_cd3f_48e5_8684_f0cf6fabd257.slice/crio-7721628df92e05fd36c6dbc1d824feab528d6487a1f9e8ec398d0420150cd21c WatchSource:0}: Error finding container 7721628df92e05fd36c6dbc1d824feab528d6487a1f9e8ec398d0420150cd21c: Status 404 returned error can't find the container with id 7721628df92e05fd36c6dbc1d824feab528d6487a1f9e8ec398d0420150cd21c Mar 07 21:39:01.434966 master-0 kubenswrapper[16352]: I0307 21:39:01.434628 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr"] Mar 07 21:39:01.527017 master-0 kubenswrapper[16352]: I0307 21:39:01.526933 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" event={"ID":"becc66df-cd3f-48e5-8684-f0cf6fabd257","Type":"ContainerStarted","Data":"7721628df92e05fd36c6dbc1d824feab528d6487a1f9e8ec398d0420150cd21c"} Mar 07 21:39:02.545275 master-0 kubenswrapper[16352]: I0307 21:39:02.545186 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" event={"ID":"becc66df-cd3f-48e5-8684-f0cf6fabd257","Type":"ContainerStarted","Data":"a72e19dc33037f68ef59f70195160dce51338930ac35c54ecdc8b5e3bd73910f"} Mar 07 21:39:02.546111 master-0 kubenswrapper[16352]: I0307 21:39:02.545479 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:39:02.584920 master-0 kubenswrapper[16352]: I0307 21:39:02.584742 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" podStartSLOduration=34.584675038 podStartE2EDuration="34.584675038s" podCreationTimestamp="2026-03-07 21:38:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:39:02.582460036 +0000 UTC m=+1265.653165155" watchObservedRunningTime="2026-03-07 21:39:02.584675038 +0000 UTC m=+1265.655380147" Mar 07 21:39:04.004572 master-0 kubenswrapper[16352]: I0307 21:39:04.004464 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-65b58d74b-rrd9h" Mar 07 21:39:04.220242 master-0 kubenswrapper[16352]: I0307 21:39:04.220170 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-dc6dbbbd-xznm5" Mar 07 21:39:10.918529 master-0 kubenswrapper[16352]: I0307 21:39:10.918449 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-7dfcb4d64f-grrjr" Mar 07 21:39:56.578731 master-0 kubenswrapper[16352]: I0307 21:39:56.578168 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-msd9g"] Mar 07 21:39:56.584702 master-0 kubenswrapper[16352]: I0307 21:39:56.584618 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.588294 master-0 kubenswrapper[16352]: I0307 21:39:56.588206 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 07 21:39:56.592157 master-0 kubenswrapper[16352]: I0307 21:39:56.592082 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 07 21:39:56.592424 master-0 kubenswrapper[16352]: I0307 21:39:56.592288 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 07 21:39:56.604180 master-0 kubenswrapper[16352]: I0307 21:39:56.603906 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-msd9g"] Mar 07 21:39:56.683804 master-0 kubenswrapper[16352]: I0307 21:39:56.683725 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-vfb6d"] Mar 07 21:39:56.688663 master-0 kubenswrapper[16352]: I0307 21:39:56.688604 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.692310 master-0 kubenswrapper[16352]: I0307 21:39:56.692215 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 07 21:39:56.706999 master-0 kubenswrapper[16352]: I0307 21:39:56.702237 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-vfb6d"] Mar 07 21:39:56.755446 master-0 kubenswrapper[16352]: I0307 21:39:56.755369 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-config\") pod \"dnsmasq-dns-69fd45f56f-msd9g\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") " pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.755747 master-0 kubenswrapper[16352]: I0307 21:39:56.755561 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6vsl8\" (UniqueName: \"kubernetes.io/projected/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-kube-api-access-6vsl8\") pod \"dnsmasq-dns-69fd45f56f-msd9g\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") " pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.857512 master-0 kubenswrapper[16352]: I0307 21:39:56.857439 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-config\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.857898 master-0 kubenswrapper[16352]: I0307 21:39:56.857570 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5754f\" (UniqueName: \"kubernetes.io/projected/4628d7f0-d710-446c-a574-db5c172ff74d-kube-api-access-5754f\") pod 
\"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.857898 master-0 kubenswrapper[16352]: I0307 21:39:56.857646 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-dns-svc\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.857898 master-0 kubenswrapper[16352]: I0307 21:39:56.857783 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-config\") pod \"dnsmasq-dns-69fd45f56f-msd9g\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") " pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.857898 master-0 kubenswrapper[16352]: I0307 21:39:56.857869 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6vsl8\" (UniqueName: \"kubernetes.io/projected/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-kube-api-access-6vsl8\") pod \"dnsmasq-dns-69fd45f56f-msd9g\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") " pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.859083 master-0 kubenswrapper[16352]: I0307 21:39:56.859064 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-config\") pod \"dnsmasq-dns-69fd45f56f-msd9g\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") " pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.876125 master-0 kubenswrapper[16352]: I0307 21:39:56.876069 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6vsl8\" (UniqueName: \"kubernetes.io/projected/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-kube-api-access-6vsl8\") pod 
\"dnsmasq-dns-69fd45f56f-msd9g\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") " pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.957300 master-0 kubenswrapper[16352]: I0307 21:39:56.957231 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" Mar 07 21:39:56.960073 master-0 kubenswrapper[16352]: I0307 21:39:56.960016 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-config\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.960292 master-0 kubenswrapper[16352]: I0307 21:39:56.960255 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5754f\" (UniqueName: \"kubernetes.io/projected/4628d7f0-d710-446c-a574-db5c172ff74d-kube-api-access-5754f\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.960365 master-0 kubenswrapper[16352]: I0307 21:39:56.960323 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-dns-svc\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.963704 master-0 kubenswrapper[16352]: I0307 21:39:56.961282 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-config\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.963704 master-0 kubenswrapper[16352]: I0307 21:39:56.961540 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-dns-svc\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:56.982422 master-0 kubenswrapper[16352]: I0307 21:39:56.981884 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5754f\" (UniqueName: \"kubernetes.io/projected/4628d7f0-d710-446c-a574-db5c172ff74d-kube-api-access-5754f\") pod \"dnsmasq-dns-667b9d65dc-vfb6d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") " pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:57.025712 master-0 kubenswrapper[16352]: I0307 21:39:57.025225 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" Mar 07 21:39:57.539989 master-0 kubenswrapper[16352]: I0307 21:39:57.539886 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-msd9g"] Mar 07 21:39:57.543398 master-0 kubenswrapper[16352]: W0307 21:39:57.543338 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2c8bc6cc_9eef_42fa_bfe0_9d2a5d61a96c.slice/crio-01a683f26d7d0c797e69c66dd05f9316c8cc8b73f17f468bda8553d1e13da8a7 WatchSource:0}: Error finding container 01a683f26d7d0c797e69c66dd05f9316c8cc8b73f17f468bda8553d1e13da8a7: Status 404 returned error can't find the container with id 01a683f26d7d0c797e69c66dd05f9316c8cc8b73f17f468bda8553d1e13da8a7 Mar 07 21:39:57.571315 master-0 kubenswrapper[16352]: W0307 21:39:57.571245 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4628d7f0_d710_446c_a574_db5c172ff74d.slice/crio-86ba46fbec68245c740964eaf76f25f41e726abd518476c7a7f0ed0ab0fefb9f WatchSource:0}: Error finding 
container 86ba46fbec68245c740964eaf76f25f41e726abd518476c7a7f0ed0ab0fefb9f: Status 404 returned error can't find the container with id 86ba46fbec68245c740964eaf76f25f41e726abd518476c7a7f0ed0ab0fefb9f Mar 07 21:39:57.574644 master-0 kubenswrapper[16352]: I0307 21:39:57.574584 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-vfb6d"] Mar 07 21:39:58.171018 master-0 kubenswrapper[16352]: I0307 21:39:58.169319 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" event={"ID":"4628d7f0-d710-446c-a574-db5c172ff74d","Type":"ContainerStarted","Data":"86ba46fbec68245c740964eaf76f25f41e726abd518476c7a7f0ed0ab0fefb9f"} Mar 07 21:39:58.172869 master-0 kubenswrapper[16352]: I0307 21:39:58.172086 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" event={"ID":"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c","Type":"ContainerStarted","Data":"01a683f26d7d0c797e69c66dd05f9316c8cc8b73f17f468bda8553d1e13da8a7"} Mar 07 21:39:59.233211 master-0 kubenswrapper[16352]: I0307 21:39:59.231079 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-vfb6d"] Mar 07 21:39:59.305662 master-0 kubenswrapper[16352]: I0307 21:39:59.302993 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7466868675-m4658"] Mar 07 21:39:59.306699 master-0 kubenswrapper[16352]: I0307 21:39:59.306098 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.328879 master-0 kubenswrapper[16352]: I0307 21:39:59.328801 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7466868675-m4658"] Mar 07 21:39:59.432669 master-0 kubenswrapper[16352]: I0307 21:39:59.432557 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp82f\" (UniqueName: \"kubernetes.io/projected/36578f90-945d-4f93-ac4e-346ff30e9119-kube-api-access-zp82f\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.433100 master-0 kubenswrapper[16352]: I0307 21:39:59.432981 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-config\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.433390 master-0 kubenswrapper[16352]: I0307 21:39:59.433340 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-dns-svc\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.537019 master-0 kubenswrapper[16352]: I0307 21:39:59.536912 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-dns-svc\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.537314 master-0 kubenswrapper[16352]: I0307 21:39:59.537190 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-zp82f\" (UniqueName: \"kubernetes.io/projected/36578f90-945d-4f93-ac4e-346ff30e9119-kube-api-access-zp82f\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.537373 master-0 kubenswrapper[16352]: I0307 21:39:59.537317 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-config\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.539865 master-0 kubenswrapper[16352]: I0307 21:39:59.538748 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-dns-svc\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.539865 master-0 kubenswrapper[16352]: I0307 21:39:59.539225 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-config\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.597231 master-0 kubenswrapper[16352]: I0307 21:39:59.597137 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp82f\" (UniqueName: \"kubernetes.io/projected/36578f90-945d-4f93-ac4e-346ff30e9119-kube-api-access-zp82f\") pod \"dnsmasq-dns-7466868675-m4658\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.644803 master-0 kubenswrapper[16352]: I0307 21:39:59.642387 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-69fd45f56f-msd9g"] Mar 07 21:39:59.646229 master-0 kubenswrapper[16352]: I0307 21:39:59.646177 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:39:59.660644 master-0 kubenswrapper[16352]: I0307 21:39:59.657722 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-qtbgb"] Mar 07 21:39:59.660644 master-0 kubenswrapper[16352]: I0307 21:39:59.659848 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.714914 master-0 kubenswrapper[16352]: I0307 21:39:59.712239 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-qtbgb"] Mar 07 21:39:59.747922 master-0 kubenswrapper[16352]: I0307 21:39:59.747826 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gm9bj\" (UniqueName: \"kubernetes.io/projected/af1ce916-8e43-4899-9c97-9aba5f6c5679-kube-api-access-gm9bj\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.748222 master-0 kubenswrapper[16352]: I0307 21:39:59.747950 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-dns-svc\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.748222 master-0 kubenswrapper[16352]: I0307 21:39:59.748008 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-config\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " 
pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.871492 master-0 kubenswrapper[16352]: I0307 21:39:59.867008 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-config\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.871492 master-0 kubenswrapper[16352]: I0307 21:39:59.867158 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gm9bj\" (UniqueName: \"kubernetes.io/projected/af1ce916-8e43-4899-9c97-9aba5f6c5679-kube-api-access-gm9bj\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.871492 master-0 kubenswrapper[16352]: I0307 21:39:59.867460 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-dns-svc\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.871492 master-0 kubenswrapper[16352]: I0307 21:39:59.869324 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-config\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:39:59.871492 master-0 kubenswrapper[16352]: I0307 21:39:59.870084 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-dns-svc\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 
21:39:59.904862 master-0 kubenswrapper[16352]: I0307 21:39:59.904657 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gm9bj\" (UniqueName: \"kubernetes.io/projected/af1ce916-8e43-4899-9c97-9aba5f6c5679-kube-api-access-gm9bj\") pod \"dnsmasq-dns-76ff7d945-qtbgb\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:40:00.001618 master-0 kubenswrapper[16352]: I0307 21:40:00.001540 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:40:00.285933 master-0 kubenswrapper[16352]: W0307 21:40:00.285859 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36578f90_945d_4f93_ac4e_346ff30e9119.slice/crio-049dc9ea02e55a40510288684d6488f5903a5720f0d275ce610ebf2ce135b134 WatchSource:0}: Error finding container 049dc9ea02e55a40510288684d6488f5903a5720f0d275ce610ebf2ce135b134: Status 404 returned error can't find the container with id 049dc9ea02e55a40510288684d6488f5903a5720f0d275ce610ebf2ce135b134 Mar 07 21:40:00.288341 master-0 kubenswrapper[16352]: I0307 21:40:00.288302 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7466868675-m4658"] Mar 07 21:40:00.602960 master-0 kubenswrapper[16352]: I0307 21:40:00.602831 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-qtbgb"] Mar 07 21:40:01.240300 master-0 kubenswrapper[16352]: I0307 21:40:01.240216 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-m4658" event={"ID":"36578f90-945d-4f93-ac4e-346ff30e9119","Type":"ContainerStarted","Data":"049dc9ea02e55a40510288684d6488f5903a5720f0d275ce610ebf2ce135b134"} Mar 07 21:40:01.242574 master-0 kubenswrapper[16352]: I0307 21:40:01.242502 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" event={"ID":"af1ce916-8e43-4899-9c97-9aba5f6c5679","Type":"ContainerStarted","Data":"4f0ed74801411574e69742a8963bfcf22ec3d9f72fb4577f499a464241086b59"} Mar 07 21:40:03.420966 master-0 kubenswrapper[16352]: I0307 21:40:03.420841 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 21:40:03.423398 master-0 kubenswrapper[16352]: I0307 21:40:03.423358 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.432015 master-0 kubenswrapper[16352]: I0307 21:40:03.431917 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 07 21:40:03.434476 master-0 kubenswrapper[16352]: I0307 21:40:03.432305 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 07 21:40:03.434476 master-0 kubenswrapper[16352]: I0307 21:40:03.432522 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 07 21:40:03.434476 master-0 kubenswrapper[16352]: I0307 21:40:03.432661 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 07 21:40:03.434476 master-0 kubenswrapper[16352]: I0307 21:40:03.432805 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 07 21:40:03.441310 master-0 kubenswrapper[16352]: I0307 21:40:03.437309 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 07 21:40:03.446704 master-0 kubenswrapper[16352]: I0307 21:40:03.446594 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 07 21:40:03.515609 master-0 kubenswrapper[16352]: I0307 21:40:03.515518 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.515609 master-0 kubenswrapper[16352]: I0307 21:40:03.515605 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-06462067-2ded-43d7-a02a-43211f51676a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb8ce72d-2724-4ef9-bdd7-a749f0b37d70\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.515609 master-0 kubenswrapper[16352]: I0307 21:40:03.515628 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 master-0 kubenswrapper[16352]: I0307 21:40:03.515655 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xtc7\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-kube-api-access-6xtc7\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 master-0 kubenswrapper[16352]: I0307 21:40:03.515702 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 
master-0 kubenswrapper[16352]: I0307 21:40:03.515738 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 master-0 kubenswrapper[16352]: I0307 21:40:03.515772 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f025883-7fbd-4887-9328-36ba8b9c326b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 master-0 kubenswrapper[16352]: I0307 21:40:03.515791 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f025883-7fbd-4887-9328-36ba8b9c326b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 master-0 kubenswrapper[16352]: I0307 21:40:03.515823 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 master-0 kubenswrapper[16352]: I0307 21:40:03.515857 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: 
\"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.516406 master-0 kubenswrapper[16352]: I0307 21:40:03.515877 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.598713 master-0 kubenswrapper[16352]: I0307 21:40:03.589757 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 07 21:40:03.598713 master-0 kubenswrapper[16352]: I0307 21:40:03.593420 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 07 21:40:03.628853 master-0 kubenswrapper[16352]: I0307 21:40:03.623706 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 07 21:40:03.628853 master-0 kubenswrapper[16352]: I0307 21:40:03.624569 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.630929 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.630997 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-06462067-2ded-43d7-a02a-43211f51676a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb8ce72d-2724-4ef9-bdd7-a749f0b37d70\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 
21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.631015 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.631042 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xtc7\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-kube-api-access-6xtc7\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.632255 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.633581 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.633639 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 
21:40:03.633735 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.633795 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f025883-7fbd-4887-9328-36ba8b9c326b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.633817 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f025883-7fbd-4887-9328-36ba8b9c326b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.633847 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.634501 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 
21:40:03.634557 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.636383 master-0 kubenswrapper[16352]: I0307 21:40:03.634586 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.639217 master-0 kubenswrapper[16352]: I0307 21:40:03.639164 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 21:40:03.639295 master-0 kubenswrapper[16352]: I0307 21:40:03.639244 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-06462067-2ded-43d7-a02a-43211f51676a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb8ce72d-2724-4ef9-bdd7-a749f0b37d70\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/ad9833a0e802107fd3c513bf9e058c0080d45bc93745a006fc146e3349baaa0d/globalmount\"" pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.645727 master-0 kubenswrapper[16352]: I0307 21:40:03.641027 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.686148 master-0 kubenswrapper[16352]: I0307 21:40:03.684856 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/6f025883-7fbd-4887-9328-36ba8b9c326b-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.686148 master-0 kubenswrapper[16352]: I0307 21:40:03.685429 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/6f025883-7fbd-4887-9328-36ba8b9c326b-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.691720 master-0 kubenswrapper[16352]: I0307 21:40:03.688368 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.698763 master-0 kubenswrapper[16352]: I0307 21:40:03.693343 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/6f025883-7fbd-4887-9328-36ba8b9c326b-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.708897 master-0 kubenswrapper[16352]: I0307 21:40:03.707251 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xtc7\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-kube-api-access-6xtc7\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.742929 master-0 kubenswrapper[16352]: I0307 21:40:03.742846 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea79a140-1767-4d8d-b766-fd36a08926da-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.742929 master-0 kubenswrapper[16352]: I0307 21:40:03.742943 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea79a140-1767-4d8d-b766-fd36a08926da-config-data\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.743304 master-0 kubenswrapper[16352]: I0307 21:40:03.743041 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kkkvt\" (UniqueName: \"kubernetes.io/projected/ea79a140-1767-4d8d-b766-fd36a08926da-kube-api-access-kkkvt\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.743304 master-0 kubenswrapper[16352]: I0307 21:40:03.743156 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea79a140-1767-4d8d-b766-fd36a08926da-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.743304 master-0 kubenswrapper[16352]: I0307 21:40:03.743186 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea79a140-1767-4d8d-b766-fd36a08926da-kolla-config\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.743304 master-0 kubenswrapper[16352]: I0307 21:40:03.743246 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 07 21:40:03.757635 master-0 kubenswrapper[16352]: I0307 
21:40:03.757559 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/6f025883-7fbd-4887-9328-36ba8b9c326b-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:03.793397 master-0 kubenswrapper[16352]: I0307 21:40:03.793305 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 07 21:40:03.858739 master-0 kubenswrapper[16352]: I0307 21:40:03.857207 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea79a140-1767-4d8d-b766-fd36a08926da-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.858739 master-0 kubenswrapper[16352]: I0307 21:40:03.857267 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea79a140-1767-4d8d-b766-fd36a08926da-kolla-config\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.858739 master-0 kubenswrapper[16352]: I0307 21:40:03.857335 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea79a140-1767-4d8d-b766-fd36a08926da-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.858739 master-0 kubenswrapper[16352]: I0307 21:40:03.857367 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea79a140-1767-4d8d-b766-fd36a08926da-config-data\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 
21:40:03.858739 master-0 kubenswrapper[16352]: I0307 21:40:03.857439 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kkkvt\" (UniqueName: \"kubernetes.io/projected/ea79a140-1767-4d8d-b766-fd36a08926da-kube-api-access-kkkvt\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.882920 master-0 kubenswrapper[16352]: I0307 21:40:03.880874 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/ea79a140-1767-4d8d-b766-fd36a08926da-kolla-config\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.892882 master-0 kubenswrapper[16352]: I0307 21:40:03.891113 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ea79a140-1767-4d8d-b766-fd36a08926da-combined-ca-bundle\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.892882 master-0 kubenswrapper[16352]: I0307 21:40:03.891186 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ea79a140-1767-4d8d-b766-fd36a08926da-config-data\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.924800 master-0 kubenswrapper[16352]: I0307 21:40:03.923593 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/ea79a140-1767-4d8d-b766-fd36a08926da-memcached-tls-certs\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.943882 master-0 kubenswrapper[16352]: I0307 21:40:03.942713 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kkkvt\" 
(UniqueName: \"kubernetes.io/projected/ea79a140-1767-4d8d-b766-fd36a08926da-kube-api-access-kkkvt\") pod \"memcached-0\" (UID: \"ea79a140-1767-4d8d-b766-fd36a08926da\") " pod="openstack/memcached-0" Mar 07 21:40:03.963072 master-0 kubenswrapper[16352]: I0307 21:40:03.959835 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 21:40:03.963072 master-0 kubenswrapper[16352]: I0307 21:40:03.962084 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 07 21:40:03.969793 master-0 kubenswrapper[16352]: I0307 21:40:03.965455 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 07 21:40:03.969793 master-0 kubenswrapper[16352]: I0307 21:40:03.967035 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 07 21:40:03.969793 master-0 kubenswrapper[16352]: I0307 21:40:03.967213 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 07 21:40:03.969793 master-0 kubenswrapper[16352]: I0307 21:40:03.967464 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 07 21:40:03.969793 master-0 kubenswrapper[16352]: I0307 21:40:03.967586 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 07 21:40:03.973343 master-0 kubenswrapper[16352]: I0307 21:40:03.973235 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 07 21:40:03.984661 master-0 kubenswrapper[16352]: I0307 21:40:03.984539 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 07 21:40:04.061175 master-0 kubenswrapper[16352]: I0307 21:40:04.061081 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" 
(UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061175 master-0 kubenswrapper[16352]: I0307 21:40:04.061180 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d2620a2-19d9-4543-922c-dc7951734958-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061229 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqpsv\" (UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-kube-api-access-dqpsv\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061291 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-b3f8d3b9-5cf0-4c92-812a-cc03c36d27f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^467e20fe-29e3-4ece-b812-7d9ab2da48c3\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061331 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061354 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d2620a2-19d9-4543-922c-dc7951734958-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061381 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061406 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061429 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-config-data\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061483 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.061522 master-0 kubenswrapper[16352]: I0307 21:40:04.061502 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163197 master-0 kubenswrapper[16352]: I0307 21:40:04.163114 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163197 master-0 kubenswrapper[16352]: I0307 21:40:04.163188 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-config-data\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163275 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163298 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163325 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163360 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d2620a2-19d9-4543-922c-dc7951734958-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163395 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dqpsv\" (UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-kube-api-access-dqpsv\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163449 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-b3f8d3b9-5cf0-4c92-812a-cc03c36d27f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^467e20fe-29e3-4ece-b812-7d9ab2da48c3\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163463 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163479 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/4d2620a2-19d9-4543-922c-dc7951734958-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.163646 master-0 kubenswrapper[16352]: I0307 21:40:04.163501 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.165457 master-0 kubenswrapper[16352]: I0307 21:40:04.164509 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.165457 master-0 kubenswrapper[16352]: I0307 21:40:04.165336 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.168441 master-0 kubenswrapper[16352]: I0307 21:40:04.167512 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-server-conf\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.168441 master-0 kubenswrapper[16352]: I0307 21:40:04.168377 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/4d2620a2-19d9-4543-922c-dc7951734958-config-data\") pod \"rabbitmq-server-0\" (UID: 
\"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.169328 master-0 kubenswrapper[16352]: I0307 21:40:04.168641 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.169765 master-0 kubenswrapper[16352]: I0307 21:40:04.169547 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.170410 master-0 kubenswrapper[16352]: I0307 21:40:04.169927 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/4d2620a2-19d9-4543-922c-dc7951734958-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.170479 master-0 kubenswrapper[16352]: I0307 21:40:04.170434 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 07 21:40:04.170532 master-0 kubenswrapper[16352]: I0307 21:40:04.170472 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-b3f8d3b9-5cf0-4c92-812a-cc03c36d27f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^467e20fe-29e3-4ece-b812-7d9ab2da48c3\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/abebdb6be5fc71a6179a05f0817591056708fc61a4820035e6db9a4b1b62f81b/globalmount\"" pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.178449 master-0 kubenswrapper[16352]: I0307 21:40:04.177372 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 07 21:40:04.181501 master-0 kubenswrapper[16352]: I0307 21:40:04.181467 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/4d2620a2-19d9-4543-922c-dc7951734958-pod-info\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.184083 master-0 kubenswrapper[16352]: I0307 21:40:04.184035 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:04.193786 master-0 kubenswrapper[16352]: I0307 21:40:04.193718 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dqpsv\" (UniqueName: \"kubernetes.io/projected/4d2620a2-19d9-4543-922c-dc7951734958-kube-api-access-dqpsv\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:05.450245 master-0 kubenswrapper[16352]: I0307 21:40:05.448499 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"pvc-06462067-2ded-43d7-a02a-43211f51676a\" (UniqueName: \"kubernetes.io/csi/topolvm.io^eb8ce72d-2724-4ef9-bdd7-a749f0b37d70\") pod \"rabbitmq-cell1-server-0\" (UID: \"6f025883-7fbd-4887-9328-36ba8b9c326b\") " pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:05.583225 master-0 kubenswrapper[16352]: I0307 21:40:05.583034 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:40:05.754968 master-0 kubenswrapper[16352]: I0307 21:40:05.754864 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 07 21:40:05.757033 master-0 kubenswrapper[16352]: I0307 21:40:05.757005 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 07 21:40:05.772135 master-0 kubenswrapper[16352]: I0307 21:40:05.766273 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 07 21:40:05.772135 master-0 kubenswrapper[16352]: I0307 21:40:05.766554 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 07 21:40:05.772135 master-0 kubenswrapper[16352]: I0307 21:40:05.768724 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 07 21:40:05.793567 master-0 kubenswrapper[16352]: I0307 21:40:05.783249 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 07 21:40:05.818189 master-0 kubenswrapper[16352]: I0307 21:40:05.818114 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.818189 master-0 kubenswrapper[16352]: I0307 21:40:05.818187 16352 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n6jll\" (UniqueName: \"kubernetes.io/projected/1bc36cee-aa13-4fd2-873c-892c54978add-kube-api-access-n6jll\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.818597 master-0 kubenswrapper[16352]: I0307 21:40:05.818231 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc36cee-aa13-4fd2-873c-892c54978add-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.818597 master-0 kubenswrapper[16352]: I0307 21:40:05.818299 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-config-data-default\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.818597 master-0 kubenswrapper[16352]: I0307 21:40:05.818318 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1bc36cee-aa13-4fd2-873c-892c54978add-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.818597 master-0 kubenswrapper[16352]: I0307 21:40:05.818368 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-40399d5b-ef3b-4708-abec-33eea3352bc1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^811b03ad-2ebc-4e34-9a19-315e6d296fcd\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 
21:40:05.818597 master-0 kubenswrapper[16352]: I0307 21:40:05.818448 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-kolla-config\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.818597 master-0 kubenswrapper[16352]: I0307 21:40:05.818491 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bc36cee-aa13-4fd2-873c-892c54978add-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920152 master-0 kubenswrapper[16352]: I0307 21:40:05.920063 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bc36cee-aa13-4fd2-873c-892c54978add-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920152 master-0 kubenswrapper[16352]: I0307 21:40:05.920154 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920577 master-0 kubenswrapper[16352]: I0307 21:40:05.920177 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n6jll\" (UniqueName: \"kubernetes.io/projected/1bc36cee-aa13-4fd2-873c-892c54978add-kube-api-access-n6jll\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920577 master-0 
kubenswrapper[16352]: I0307 21:40:05.920218 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc36cee-aa13-4fd2-873c-892c54978add-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920577 master-0 kubenswrapper[16352]: I0307 21:40:05.920270 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1bc36cee-aa13-4fd2-873c-892c54978add-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920577 master-0 kubenswrapper[16352]: I0307 21:40:05.920290 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-config-data-default\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920577 master-0 kubenswrapper[16352]: I0307 21:40:05.920340 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-40399d5b-ef3b-4708-abec-33eea3352bc1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^811b03ad-2ebc-4e34-9a19-315e6d296fcd\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.920577 master-0 kubenswrapper[16352]: I0307 21:40:05.920397 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-kolla-config\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.922667 master-0 kubenswrapper[16352]: I0307 21:40:05.922571 
16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-config-data-default\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.922912 master-0 kubenswrapper[16352]: I0307 21:40:05.922861 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/1bc36cee-aa13-4fd2-873c-892c54978add-config-data-generated\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.923208 master-0 kubenswrapper[16352]: I0307 21:40:05.923156 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-kolla-config\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.923208 master-0 kubenswrapper[16352]: I0307 21:40:05.922935 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1bc36cee-aa13-4fd2-873c-892c54978add-operator-scripts\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.923667 master-0 kubenswrapper[16352]: I0307 21:40:05.923634 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 07 21:40:05.923762 master-0 kubenswrapper[16352]: I0307 21:40:05.923669 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-40399d5b-ef3b-4708-abec-33eea3352bc1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^811b03ad-2ebc-4e34-9a19-315e6d296fcd\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/f0b0521a8cf0e71fae14d27119da673e85fb0627843638124b2c20ef616a91c4/globalmount\"" pod="openstack/openstack-galera-0" Mar 07 21:40:05.935710 master-0 kubenswrapper[16352]: I0307 21:40:05.935619 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1bc36cee-aa13-4fd2-873c-892c54978add-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.938801 master-0 kubenswrapper[16352]: I0307 21:40:05.938667 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n6jll\" (UniqueName: \"kubernetes.io/projected/1bc36cee-aa13-4fd2-873c-892c54978add-kube-api-access-n6jll\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:05.939034 master-0 kubenswrapper[16352]: I0307 21:40:05.938960 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/1bc36cee-aa13-4fd2-873c-892c54978add-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:06.228621 master-0 kubenswrapper[16352]: I0307 21:40:06.228442 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 21:40:06.232385 master-0 kubenswrapper[16352]: I0307 21:40:06.231131 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.236390 master-0 kubenswrapper[16352]: I0307 21:40:06.236327 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 07 21:40:06.248033 master-0 kubenswrapper[16352]: I0307 21:40:06.237192 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 07 21:40:06.248033 master-0 kubenswrapper[16352]: I0307 21:40:06.237334 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 07 21:40:06.253143 master-0 kubenswrapper[16352]: I0307 21:40:06.253057 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 07 21:40:06.340400 master-0 kubenswrapper[16352]: I0307 21:40:06.340312 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e6075f-5d67-4d68-a26b-621590c4ca33-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.341222 master-0 kubenswrapper[16352]: I0307 21:40:06.340443 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/90e6075f-5d67-4d68-a26b-621590c4ca33-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.341222 master-0 kubenswrapper[16352]: I0307 21:40:06.340517 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw4n8\" (UniqueName: \"kubernetes.io/projected/90e6075f-5d67-4d68-a26b-621590c4ca33-kube-api-access-kw4n8\") pod \"openstack-cell1-galera-0\" (UID: 
\"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.341222 master-0 kubenswrapper[16352]: I0307 21:40:06.340556 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.341222 master-0 kubenswrapper[16352]: I0307 21:40:06.340772 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.341222 master-0 kubenswrapper[16352]: I0307 21:40:06.341010 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e6075f-5d67-4d68-a26b-621590c4ca33-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.341222 master-0 kubenswrapper[16352]: I0307 21:40:06.341058 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-d4c85bc6-4ba4-41f8-ac2f-4164794b47c9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6b92da40-da2c-4114-b24c-a53afb0c1e85\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.341222 master-0 kubenswrapper[16352]: I0307 21:40:06.341090 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.445595 master-0 kubenswrapper[16352]: I0307 21:40:06.445447 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/90e6075f-5d67-4d68-a26b-621590c4ca33-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446066 master-0 kubenswrapper[16352]: I0307 21:40:06.445614 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw4n8\" (UniqueName: \"kubernetes.io/projected/90e6075f-5d67-4d68-a26b-621590c4ca33-kube-api-access-kw4n8\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446066 master-0 kubenswrapper[16352]: I0307 21:40:06.445724 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446066 master-0 kubenswrapper[16352]: I0307 21:40:06.445809 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446066 master-0 kubenswrapper[16352]: I0307 21:40:06.445898 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e6075f-5d67-4d68-a26b-621590c4ca33-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446066 master-0 kubenswrapper[16352]: I0307 21:40:06.445935 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-d4c85bc6-4ba4-41f8-ac2f-4164794b47c9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6b92da40-da2c-4114-b24c-a53afb0c1e85\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446066 master-0 kubenswrapper[16352]: I0307 21:40:06.445966 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446387 master-0 kubenswrapper[16352]: I0307 21:40:06.446090 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e6075f-5d67-4d68-a26b-621590c4ca33-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.446387 master-0 kubenswrapper[16352]: I0307 21:40:06.446332 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/90e6075f-5d67-4d68-a26b-621590c4ca33-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.447504 master-0 kubenswrapper[16352]: I0307 21:40:06.447449 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"config-data-default\" (UniqueName: \"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.449959 master-0 kubenswrapper[16352]: I0307 21:40:06.449916 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.450497 master-0 kubenswrapper[16352]: I0307 21:40:06.450454 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 21:40:06.451010 master-0 kubenswrapper[16352]: I0307 21:40:06.450512 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-d4c85bc6-4ba4-41f8-ac2f-4164794b47c9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6b92da40-da2c-4114-b24c-a53afb0c1e85\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/40e4ef6f7fc18de85d860dfaab66282f924b294aba7fca54fed5ed387631af0f/globalmount\"" pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.452625 master-0 kubenswrapper[16352]: I0307 21:40:06.452576 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/90e6075f-5d67-4d68-a26b-621590c4ca33-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.452735 master-0 kubenswrapper[16352]: I0307 21:40:06.452594 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/90e6075f-5d67-4d68-a26b-621590c4ca33-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.453174 master-0 kubenswrapper[16352]: I0307 21:40:06.453131 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/90e6075f-5d67-4d68-a26b-621590c4ca33-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.478078 master-0 kubenswrapper[16352]: I0307 21:40:06.478006 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw4n8\" (UniqueName: \"kubernetes.io/projected/90e6075f-5d67-4d68-a26b-621590c4ca33-kube-api-access-kw4n8\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:06.778471 master-0 kubenswrapper[16352]: I0307 21:40:06.778410 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-b3f8d3b9-5cf0-4c92-812a-cc03c36d27f4\" (UniqueName: \"kubernetes.io/csi/topolvm.io^467e20fe-29e3-4ece-b812-7d9ab2da48c3\") pod \"rabbitmq-server-0\" (UID: \"4d2620a2-19d9-4543-922c-dc7951734958\") " pod="openstack/rabbitmq-server-0" Mar 07 21:40:07.009780 master-0 kubenswrapper[16352]: I0307 21:40:07.008590 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 07 21:40:08.733321 master-0 kubenswrapper[16352]: I0307 21:40:08.733215 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-40399d5b-ef3b-4708-abec-33eea3352bc1\" (UniqueName: \"kubernetes.io/csi/topolvm.io^811b03ad-2ebc-4e34-9a19-315e6d296fcd\") pod \"openstack-galera-0\" (UID: \"1bc36cee-aa13-4fd2-873c-892c54978add\") " pod="openstack/openstack-galera-0" Mar 07 21:40:09.403400 master-0 kubenswrapper[16352]: I0307 21:40:09.403255 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 07 21:40:09.773754 master-0 kubenswrapper[16352]: I0307 21:40:09.773635 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-d4c85bc6-4ba4-41f8-ac2f-4164794b47c9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6b92da40-da2c-4114-b24c-a53afb0c1e85\") pod \"openstack-cell1-galera-0\" (UID: \"90e6075f-5d67-4d68-a26b-621590c4ca33\") " pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:10.194952 master-0 kubenswrapper[16352]: I0307 21:40:10.194805 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 07 21:40:11.448713 master-0 kubenswrapper[16352]: I0307 21:40:11.447591 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wptpb"] Mar 07 21:40:11.451940 master-0 kubenswrapper[16352]: I0307 21:40:11.450164 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.453814 master-0 kubenswrapper[16352]: I0307 21:40:11.453459 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 07 21:40:11.455839 master-0 kubenswrapper[16352]: I0307 21:40:11.455805 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 07 21:40:11.465980 master-0 kubenswrapper[16352]: I0307 21:40:11.465914 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wptpb"] Mar 07 21:40:11.495946 master-0 kubenswrapper[16352]: I0307 21:40:11.495897 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-csxfx"] Mar 07 21:40:11.501957 master-0 kubenswrapper[16352]: I0307 21:40:11.501911 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.506808 master-0 kubenswrapper[16352]: I0307 21:40:11.506752 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-csxfx"] Mar 07 21:40:11.637407 master-0 kubenswrapper[16352]: I0307 21:40:11.636836 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-scripts\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.639595 master-0 kubenswrapper[16352]: I0307 21:40:11.639522 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xqrg\" (UniqueName: \"kubernetes.io/projected/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-kube-api-access-7xqrg\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.640092 master-0 
kubenswrapper[16352]: I0307 21:40:11.640066 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-log-ovn\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.640214 master-0 kubenswrapper[16352]: I0307 21:40:11.640196 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-etc-ovs\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.640356 master-0 kubenswrapper[16352]: I0307 21:40:11.640337 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-log\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.640592 master-0 kubenswrapper[16352]: I0307 21:40:11.640570 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-ovn-controller-tls-certs\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.640819 master-0 kubenswrapper[16352]: I0307 21:40:11.640780 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-lib\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 
21:40:11.640982 master-0 kubenswrapper[16352]: I0307 21:40:11.640962 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-run\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.641211 master-0 kubenswrapper[16352]: I0307 21:40:11.641192 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd69bc9d-5b53-4868-bfab-b3956e54600d-scripts\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.641399 master-0 kubenswrapper[16352]: I0307 21:40:11.641381 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-run-ovn\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.642671 master-0 kubenswrapper[16352]: I0307 21:40:11.642420 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-run\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.642671 master-0 kubenswrapper[16352]: I0307 21:40:11.642510 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-combined-ca-bundle\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 
07 21:40:11.642910 master-0 kubenswrapper[16352]: I0307 21:40:11.642821 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pstn2\" (UniqueName: \"kubernetes.io/projected/cd69bc9d-5b53-4868-bfab-b3956e54600d-kube-api-access-pstn2\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.749472 master-0 kubenswrapper[16352]: I0307 21:40:11.749233 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-scripts\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.750542 master-0 kubenswrapper[16352]: I0307 21:40:11.750512 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xqrg\" (UniqueName: \"kubernetes.io/projected/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-kube-api-access-7xqrg\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.751149 master-0 kubenswrapper[16352]: I0307 21:40:11.751127 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-log-ovn\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.751322 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-etc-ovs\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.753351 master-0 
kubenswrapper[16352]: I0307 21:40:11.751404 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-log\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.751488 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-lib\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.751759 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-ovn-controller-tls-certs\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.751831 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-run\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.751912 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd69bc9d-5b53-4868-bfab-b3956e54600d-scripts\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.751996 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-run-ovn\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.752105 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-run\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.752197 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-combined-ca-bundle\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.752363 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pstn2\" (UniqueName: \"kubernetes.io/projected/cd69bc9d-5b53-4868-bfab-b3956e54600d-kube-api-access-pstn2\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.752441 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-log-ovn\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.752529 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"scripts\" (UniqueName: \"kubernetes.io/configmap/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-scripts\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.752916 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-log\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.753351 master-0 kubenswrapper[16352]: I0307 21:40:11.753207 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-etc-ovs\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.758254 master-0 kubenswrapper[16352]: I0307 21:40:11.753376 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-lib\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.758254 master-0 kubenswrapper[16352]: I0307 21:40:11.753458 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-run\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.758254 master-0 kubenswrapper[16352]: I0307 21:40:11.753617 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-var-run-ovn\") pod \"ovn-controller-wptpb\" (UID: 
\"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.758254 master-0 kubenswrapper[16352]: I0307 21:40:11.755953 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/cd69bc9d-5b53-4868-bfab-b3956e54600d-scripts\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.758254 master-0 kubenswrapper[16352]: I0307 21:40:11.756193 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/cd69bc9d-5b53-4868-bfab-b3956e54600d-var-run\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.760163 master-0 kubenswrapper[16352]: I0307 21:40:11.760136 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-ovn-controller-tls-certs\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.760283 master-0 kubenswrapper[16352]: I0307 21:40:11.760259 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-combined-ca-bundle\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.768038 master-0 kubenswrapper[16352]: I0307 21:40:11.767531 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pstn2\" (UniqueName: \"kubernetes.io/projected/cd69bc9d-5b53-4868-bfab-b3956e54600d-kube-api-access-pstn2\") pod \"ovn-controller-ovs-csxfx\" (UID: \"cd69bc9d-5b53-4868-bfab-b3956e54600d\") " 
pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:11.770426 master-0 kubenswrapper[16352]: I0307 21:40:11.770352 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xqrg\" (UniqueName: \"kubernetes.io/projected/59890fa2-937c-4fc1-9f3d-6c2297a9d46b-kube-api-access-7xqrg\") pod \"ovn-controller-wptpb\" (UID: \"59890fa2-937c-4fc1-9f3d-6c2297a9d46b\") " pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.803960 master-0 kubenswrapper[16352]: I0307 21:40:11.803874 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wptpb" Mar 07 21:40:11.843729 master-0 kubenswrapper[16352]: I0307 21:40:11.843633 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:12.433483 master-0 kubenswrapper[16352]: I0307 21:40:12.432640 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 21:40:12.445069 master-0 kubenswrapper[16352]: I0307 21:40:12.444638 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.450157 master-0 kubenswrapper[16352]: I0307 21:40:12.450072 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 07 21:40:12.450940 master-0 kubenswrapper[16352]: I0307 21:40:12.450862 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 07 21:40:12.451074 master-0 kubenswrapper[16352]: I0307 21:40:12.451035 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 07 21:40:12.451226 master-0 kubenswrapper[16352]: I0307 21:40:12.450244 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 07 21:40:12.464393 master-0 kubenswrapper[16352]: I0307 21:40:12.464326 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 07 21:40:12.484638 master-0 kubenswrapper[16352]: I0307 21:40:12.484543 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.485020 master-0 kubenswrapper[16352]: I0307 21:40:12.484796 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f284e621-c075-47ba-9d59-f6491ac0698a-config\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.485091 master-0 kubenswrapper[16352]: I0307 21:40:12.485058 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/f284e621-c075-47ba-9d59-f6491ac0698a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.485151 master-0 kubenswrapper[16352]: I0307 21:40:12.485127 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f284e621-c075-47ba-9d59-f6491ac0698a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.487833 master-0 kubenswrapper[16352]: I0307 21:40:12.485761 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.487833 master-0 kubenswrapper[16352]: I0307 21:40:12.485822 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dps65\" (UniqueName: \"kubernetes.io/projected/f284e621-c075-47ba-9d59-f6491ac0698a-kube-api-access-dps65\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.487833 master-0 kubenswrapper[16352]: I0307 21:40:12.485982 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.487833 master-0 kubenswrapper[16352]: I0307 21:40:12.486192 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"pvc-acc72ec2-5af2-41cb-8898-db335c63aa17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^740beddd-2e2e-4111-b61d-e459bd199e7b\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.589809 master-0 kubenswrapper[16352]: I0307 21:40:12.589720 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-acc72ec2-5af2-41cb-8898-db335c63aa17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^740beddd-2e2e-4111-b61d-e459bd199e7b\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.590141 master-0 kubenswrapper[16352]: I0307 21:40:12.589900 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.590141 master-0 kubenswrapper[16352]: I0307 21:40:12.589946 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f284e621-c075-47ba-9d59-f6491ac0698a-config\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.590141 master-0 kubenswrapper[16352]: I0307 21:40:12.590007 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f284e621-c075-47ba-9d59-f6491ac0698a-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.590141 master-0 kubenswrapper[16352]: I0307 21:40:12.590044 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/f284e621-c075-47ba-9d59-f6491ac0698a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.590141 master-0 kubenswrapper[16352]: I0307 21:40:12.590112 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.590398 master-0 kubenswrapper[16352]: I0307 21:40:12.590147 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dps65\" (UniqueName: \"kubernetes.io/projected/f284e621-c075-47ba-9d59-f6491ac0698a-kube-api-access-dps65\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.590398 master-0 kubenswrapper[16352]: I0307 21:40:12.590257 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.591840 master-0 kubenswrapper[16352]: I0307 21:40:12.591794 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f284e621-c075-47ba-9d59-f6491ac0698a-config\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.592161 master-0 kubenswrapper[16352]: I0307 21:40:12.592134 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/f284e621-c075-47ba-9d59-f6491ac0698a-ovsdb-rundir\") pod 
\"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.592402 master-0 kubenswrapper[16352]: I0307 21:40:12.592338 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f284e621-c075-47ba-9d59-f6491ac0698a-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.593387 master-0 kubenswrapper[16352]: I0307 21:40:12.593339 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 21:40:12.593615 master-0 kubenswrapper[16352]: I0307 21:40:12.593409 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-acc72ec2-5af2-41cb-8898-db335c63aa17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^740beddd-2e2e-4111-b61d-e459bd199e7b\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/64d72b96202654ef95b2fd8df830f89b9472017e233fb98812df0fd0e135ec99/globalmount\"" pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.595651 master-0 kubenswrapper[16352]: I0307 21:40:12.595627 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.596147 master-0 kubenswrapper[16352]: I0307 21:40:12.596102 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.596811 
master-0 kubenswrapper[16352]: I0307 21:40:12.596770 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/f284e621-c075-47ba-9d59-f6491ac0698a-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:12.613045 master-0 kubenswrapper[16352]: I0307 21:40:12.612952 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dps65\" (UniqueName: \"kubernetes.io/projected/f284e621-c075-47ba-9d59-f6491ac0698a-kube-api-access-dps65\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:13.980946 master-0 kubenswrapper[16352]: I0307 21:40:13.980820 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 21:40:13.988088 master-0 kubenswrapper[16352]: I0307 21:40:13.988029 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:13.993855 master-0 kubenswrapper[16352]: I0307 21:40:13.993166 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 07 21:40:13.993855 master-0 kubenswrapper[16352]: I0307 21:40:13.993585 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 07 21:40:13.994130 master-0 kubenswrapper[16352]: I0307 21:40:13.993918 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 07 21:40:14.003146 master-0 kubenswrapper[16352]: I0307 21:40:14.003073 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 07 21:40:14.014571 master-0 kubenswrapper[16352]: I0307 21:40:14.014481 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-acc72ec2-5af2-41cb-8898-db335c63aa17\" (UniqueName: \"kubernetes.io/csi/topolvm.io^740beddd-2e2e-4111-b61d-e459bd199e7b\") pod \"ovsdbserver-nb-0\" (UID: \"f284e621-c075-47ba-9d59-f6491ac0698a\") " pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:14.129480 master-0 kubenswrapper[16352]: I0307 21:40:14.129292 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-efd6847b-7fd2-49ab-82e8-7e90acb0cc87\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3035db21-542a-4201-825f-f56af8ba2e49\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:14.129480 master-0 kubenswrapper[16352]: I0307 21:40:14.129399 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c592859-6e1b-4c8c-afa7-9471e3991980-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:14.129480 
master-0 kubenswrapper[16352]: I0307 21:40:14.129462 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c592859-6e1b-4c8c-afa7-9471e3991980-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:14.129790 master-0 kubenswrapper[16352]: I0307 21:40:14.129564 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:14.129790 master-0 kubenswrapper[16352]: I0307 21:40:14.129591 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c592859-6e1b-4c8c-afa7-9471e3991980-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:14.130080 master-0 kubenswrapper[16352]: I0307 21:40:14.130018 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:14.130452 master-0 kubenswrapper[16352]: I0307 21:40:14.130362 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:14.130576 
master-0 kubenswrapper[16352]: I0307 21:40:14.130548 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mzllf\" (UniqueName: \"kubernetes.io/projected/7c592859-6e1b-4c8c-afa7-9471e3991980-kube-api-access-mzllf\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.233541 master-0 kubenswrapper[16352]: I0307 21:40:14.233122 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.233874 master-0 kubenswrapper[16352]: I0307 21:40:14.233535 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c592859-6e1b-4c8c-afa7-9471e3991980-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.234258 master-0 kubenswrapper[16352]: I0307 21:40:14.233970 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.234342 master-0 kubenswrapper[16352]: I0307 21:40:14.234252 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.235106 master-0 kubenswrapper[16352]: I0307 21:40:14.234523 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7c592859-6e1b-4c8c-afa7-9471e3991980-config\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.235106 master-0 kubenswrapper[16352]: I0307 21:40:14.234563 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mzllf\" (UniqueName: \"kubernetes.io/projected/7c592859-6e1b-4c8c-afa7-9471e3991980-kube-api-access-mzllf\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.235106 master-0 kubenswrapper[16352]: I0307 21:40:14.234848 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-efd6847b-7fd2-49ab-82e8-7e90acb0cc87\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3035db21-542a-4201-825f-f56af8ba2e49\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.235106 master-0 kubenswrapper[16352]: I0307 21:40:14.234892 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c592859-6e1b-4c8c-afa7-9471e3991980-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.235106 master-0 kubenswrapper[16352]: I0307 21:40:14.234949 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c592859-6e1b-4c8c-afa7-9471e3991980-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.236265 master-0 kubenswrapper[16352]: I0307 21:40:14.235773 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/7c592859-6e1b-4c8c-afa7-9471e3991980-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.236560 master-0 kubenswrapper[16352]: I0307 21:40:14.236505 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7c592859-6e1b-4c8c-afa7-9471e3991980-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.239073 master-0 kubenswrapper[16352]: I0307 21:40:14.238542 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 07 21:40:14.239073 master-0 kubenswrapper[16352]: I0307 21:40:14.238585 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-efd6847b-7fd2-49ab-82e8-7e90acb0cc87\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3035db21-542a-4201-825f-f56af8ba2e49\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/499478d081480c0909418955ef3010a0d42cb8e0484a36f5c71272f0ffc0cee2/globalmount\"" pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.240188 master-0 kubenswrapper[16352]: I0307 21:40:14.240130 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.240188 master-0 kubenswrapper[16352]: I0307 21:40:14.240139 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.241219 master-0 kubenswrapper[16352]: I0307 21:40:14.240789 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c592859-6e1b-4c8c-afa7-9471e3991980-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.259749 master-0 kubenswrapper[16352]: I0307 21:40:14.259668 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mzllf\" (UniqueName: \"kubernetes.io/projected/7c592859-6e1b-4c8c-afa7-9471e3991980-kube-api-access-mzllf\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:14.275349 master-0 kubenswrapper[16352]: I0307 21:40:14.274513 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0"
Mar 07 21:40:15.687382 master-0 kubenswrapper[16352]: I0307 21:40:15.687305 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-efd6847b-7fd2-49ab-82e8-7e90acb0cc87\" (UniqueName: \"kubernetes.io/csi/topolvm.io^3035db21-542a-4201-825f-f56af8ba2e49\") pod \"ovsdbserver-sb-0\" (UID: \"7c592859-6e1b-4c8c-afa7-9471e3991980\") " pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:15.821010 master-0 kubenswrapper[16352]: I0307 21:40:15.820939 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0"
Mar 07 21:40:19.334795 master-0 kubenswrapper[16352]: I0307 21:40:19.332430 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"]
Mar 07 21:40:19.354037 master-0 kubenswrapper[16352]: I0307 21:40:19.353955 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"]
Mar 07 21:40:19.368706 master-0 kubenswrapper[16352]: I0307 21:40:19.368631 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"]
Mar 07 21:40:19.419891 master-0 kubenswrapper[16352]: I0307 21:40:19.419825 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"]
Mar 07 21:40:19.452313 master-0 kubenswrapper[16352]: I0307 21:40:19.452213 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wptpb"]
Mar 07 21:40:19.513167 master-0 kubenswrapper[16352]: I0307 21:40:19.513108 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"]
Mar 07 21:40:19.557610 master-0 kubenswrapper[16352]: I0307 21:40:19.557496 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"]
Mar 07 21:40:19.662319 master-0 kubenswrapper[16352]: I0307 21:40:19.662206 16352 generic.go:334] "Generic (PLEG): container finished" podID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerID="3f7dcbd8ee8c4e6ce6efda577f7a03da7bb52d8ae4f0377a93036487710d1905" exitCode=0
Mar 07 21:40:19.662423 master-0 kubenswrapper[16352]: I0307 21:40:19.662337 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" event={"ID":"af1ce916-8e43-4899-9c97-9aba5f6c5679","Type":"ContainerDied","Data":"3f7dcbd8ee8c4e6ce6efda577f7a03da7bb52d8ae4f0377a93036487710d1905"}
Mar 07 21:40:19.664694 master-0 kubenswrapper[16352]: I0307 21:40:19.664599 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f284e621-c075-47ba-9d59-f6491ac0698a","Type":"ContainerStarted","Data":"2d6022464dd67e00cb394766163bea01d722b7d3be1760f2f82be390e56f9357"}
Mar 07 21:40:19.666051 master-0 kubenswrapper[16352]: I0307 21:40:19.666001 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"90e6075f-5d67-4d68-a26b-621590c4ca33","Type":"ContainerStarted","Data":"b8d4b1ca5de287b4169259f1221bf0e83efd92c9140fa4979742352bb244a921"}
Mar 07 21:40:19.667147 master-0 kubenswrapper[16352]: W0307 21:40:19.667043 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c592859_6e1b_4c8c_afa7_9471e3991980.slice/crio-1ec4487dfce50f5997e2496281fe7d1951deb02bbd2272d386b0a18fefc836b2 WatchSource:0}: Error finding container 1ec4487dfce50f5997e2496281fe7d1951deb02bbd2272d386b0a18fefc836b2: Status 404 returned error can't find the container with id 1ec4487dfce50f5997e2496281fe7d1951deb02bbd2272d386b0a18fefc836b2
Mar 07 21:40:19.668224 master-0 kubenswrapper[16352]: I0307 21:40:19.668166 16352 generic.go:334] "Generic (PLEG): container finished" podID="2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c" containerID="bac06120d96ebeffe8c0d6c4b47ecff0b9246f3d14ad88082d9e91f9abbe428a" exitCode=0
Mar 07 21:40:19.668269 master-0 kubenswrapper[16352]: I0307 21:40:19.668217 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" event={"ID":"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c","Type":"ContainerDied","Data":"bac06120d96ebeffe8c0d6c4b47ecff0b9246f3d14ad88082d9e91f9abbe428a"}
Mar 07 21:40:19.668464 master-0 kubenswrapper[16352]: I0307 21:40:19.668403 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"]
Mar 07 21:40:19.669595 master-0 kubenswrapper[16352]: I0307 21:40:19.669554 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f025883-7fbd-4887-9328-36ba8b9c326b","Type":"ContainerStarted","Data":"bc7515b1e0446fd3eb150a7daa8b679fca2002ec3a445f90e7676b2ddedb3bdf"}
Mar 07 21:40:19.672244 master-0 kubenswrapper[16352]: I0307 21:40:19.671507 16352 generic.go:334] "Generic (PLEG): container finished" podID="36578f90-945d-4f93-ac4e-346ff30e9119" containerID="adae7b29722b5fef6af3c97e8a0de88eedae76aefaf6fb4c97e12081bee72677" exitCode=0
Mar 07 21:40:19.672244 master-0 kubenswrapper[16352]: I0307 21:40:19.671553 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-m4658" event={"ID":"36578f90-945d-4f93-ac4e-346ff30e9119","Type":"ContainerDied","Data":"adae7b29722b5fef6af3c97e8a0de88eedae76aefaf6fb4c97e12081bee72677"}
Mar 07 21:40:19.675313 master-0 kubenswrapper[16352]: I0307 21:40:19.675281 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ea79a140-1767-4d8d-b766-fd36a08926da","Type":"ContainerStarted","Data":"b62a55b80fbd51c36f7e0b1f8f6bab65de57732b945b1769f90b42e4e1bcf69e"}
Mar 07 21:40:19.677322 master-0 kubenswrapper[16352]: I0307 21:40:19.677294 16352 generic.go:334] "Generic (PLEG): container finished" podID="4628d7f0-d710-446c-a574-db5c172ff74d" containerID="348241563d0c3147184682c4eb06857293d610dbe5c83937568836b089c64620" exitCode=0
Mar 07 21:40:19.677428 master-0 kubenswrapper[16352]: I0307 21:40:19.677394 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" event={"ID":"4628d7f0-d710-446c-a574-db5c172ff74d","Type":"ContainerDied","Data":"348241563d0c3147184682c4eb06857293d610dbe5c83937568836b089c64620"}
Mar 07 21:40:19.680120 master-0 kubenswrapper[16352]: I0307 21:40:19.680086 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wptpb" event={"ID":"59890fa2-937c-4fc1-9f3d-6c2297a9d46b","Type":"ContainerStarted","Data":"84ddd82b4a0d1c8913e4907f22fd7a4ed79d1b3792f3b60e8bf4a9b7023c4475"}
Mar 07 21:40:19.681361 master-0 kubenswrapper[16352]: I0307 21:40:19.681325 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4d2620a2-19d9-4543-922c-dc7951734958","Type":"ContainerStarted","Data":"4fbb5fddf74cbc1b87cd8b71bce6c0f59c5db52225034f35e2a6f332ff6d2ea2"}
Mar 07 21:40:19.682723 master-0 kubenswrapper[16352]: I0307 21:40:19.682636 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1bc36cee-aa13-4fd2-873c-892c54978add","Type":"ContainerStarted","Data":"e71eca837677dba9030f2a5b96f337ffa23a4b53c668ae6618142de2772d9f5e"}
Mar 07 21:40:20.412264 master-0 kubenswrapper[16352]: I0307 21:40:20.412203 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-msd9g"
Mar 07 21:40:20.416220 master-0 kubenswrapper[16352]: I0307 21:40:20.416157 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d"
Mar 07 21:40:20.512424 master-0 kubenswrapper[16352]: I0307 21:40:20.512363 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-csxfx"]
Mar 07 21:40:20.555550 master-0 kubenswrapper[16352]: I0307 21:40:20.555478 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5754f\" (UniqueName: \"kubernetes.io/projected/4628d7f0-d710-446c-a574-db5c172ff74d-kube-api-access-5754f\") pod \"4628d7f0-d710-446c-a574-db5c172ff74d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") "
Mar 07 21:40:20.555798 master-0 kubenswrapper[16352]: I0307 21:40:20.555601 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-config\") pod \"4628d7f0-d710-446c-a574-db5c172ff74d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") "
Mar 07 21:40:20.556139 master-0 kubenswrapper[16352]: I0307 21:40:20.556108 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6vsl8\" (UniqueName: \"kubernetes.io/projected/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-kube-api-access-6vsl8\") pod \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") "
Mar 07 21:40:20.556319 master-0 kubenswrapper[16352]: I0307 21:40:20.556288 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-dns-svc\") pod \"4628d7f0-d710-446c-a574-db5c172ff74d\" (UID: \"4628d7f0-d710-446c-a574-db5c172ff74d\") "
Mar 07 21:40:20.556487 master-0 kubenswrapper[16352]: I0307 21:40:20.556462 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-config\") pod \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\" (UID: \"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c\") "
Mar 07 21:40:20.563067 master-0 kubenswrapper[16352]: I0307 21:40:20.562987 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-kube-api-access-6vsl8" (OuterVolumeSpecName: "kube-api-access-6vsl8") pod "2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c" (UID: "2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c"). InnerVolumeSpecName "kube-api-access-6vsl8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:40:20.573514 master-0 kubenswrapper[16352]: I0307 21:40:20.572318 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4628d7f0-d710-446c-a574-db5c172ff74d-kube-api-access-5754f" (OuterVolumeSpecName: "kube-api-access-5754f") pod "4628d7f0-d710-446c-a574-db5c172ff74d" (UID: "4628d7f0-d710-446c-a574-db5c172ff74d"). InnerVolumeSpecName "kube-api-access-5754f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:40:20.589580 master-0 kubenswrapper[16352]: I0307 21:40:20.589499 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4628d7f0-d710-446c-a574-db5c172ff74d" (UID: "4628d7f0-d710-446c-a574-db5c172ff74d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:40:20.616985 master-0 kubenswrapper[16352]: I0307 21:40:20.613422 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-config" (OuterVolumeSpecName: "config") pod "4628d7f0-d710-446c-a574-db5c172ff74d" (UID: "4628d7f0-d710-446c-a574-db5c172ff74d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: I0307 21:40:20.635377 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-config" (OuterVolumeSpecName: "config") pod "2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c" (UID: "2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: I0307 21:40:20.639625 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-h69l5"]
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: E0307 21:40:20.640145 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c" containerName="init"
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: I0307 21:40:20.640159 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c" containerName="init"
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: E0307 21:40:20.640173 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4628d7f0-d710-446c-a574-db5c172ff74d" containerName="init"
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: I0307 21:40:20.640181 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="4628d7f0-d710-446c-a574-db5c172ff74d" containerName="init"
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: I0307 21:40:20.640419 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="4628d7f0-d710-446c-a574-db5c172ff74d" containerName="init"
Mar 07 21:40:20.640783 master-0 kubenswrapper[16352]: I0307 21:40:20.640446 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c" containerName="init"
Mar 07 21:40:20.641700 master-0 kubenswrapper[16352]: I0307 21:40:20.641645 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.648845 master-0 kubenswrapper[16352]: I0307 21:40:20.647782 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config"
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.658702 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.658807 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-combined-ca-bundle\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.658864 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-ovn-rundir\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.658892 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-config\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.658953 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-ovs-rundir\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.658986 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hclv\" (UniqueName: \"kubernetes.io/projected/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-kube-api-access-9hclv\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.659058 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.659076 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5754f\" (UniqueName: \"kubernetes.io/projected/4628d7f0-d710-446c-a574-db5c172ff74d-kube-api-access-5754f\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.659088 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.659097 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6vsl8\" (UniqueName: \"kubernetes.io/projected/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c-kube-api-access-6vsl8\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:20.662476 master-0 kubenswrapper[16352]: I0307 21:40:20.659106 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4628d7f0-d710-446c-a574-db5c172ff74d-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:20.677370 master-0 kubenswrapper[16352]: I0307 21:40:20.674147 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h69l5"]
Mar 07 21:40:20.769581 master-0 kubenswrapper[16352]: I0307 21:40:20.769500 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" event={"ID":"af1ce916-8e43-4899-9c97-9aba5f6c5679","Type":"ContainerStarted","Data":"856e6fab21192d51dfc9753d403ded467f75f1bdb4d0d3ab344b3aa7bc161f28"}
Mar 07 21:40:20.770212 master-0 kubenswrapper[16352]: I0307 21:40:20.770150 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb"
Mar 07 21:40:20.773643 master-0 kubenswrapper[16352]: I0307 21:40:20.773580 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d" event={"ID":"4628d7f0-d710-446c-a574-db5c172ff74d","Type":"ContainerDied","Data":"86ba46fbec68245c740964eaf76f25f41e726abd518476c7a7f0ed0ab0fefb9f"}
Mar 07 21:40:20.773837 master-0 kubenswrapper[16352]: I0307 21:40:20.773615 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-667b9d65dc-vfb6d"
Mar 07 21:40:20.773973 master-0 kubenswrapper[16352]: I0307 21:40:20.773818 16352 scope.go:117] "RemoveContainer" containerID="348241563d0c3147184682c4eb06857293d610dbe5c83937568836b089c64620"
Mar 07 21:40:20.791779 master-0 kubenswrapper[16352]: I0307 21:40:20.782323 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-combined-ca-bundle\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.794756 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-combined-ca-bundle\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.787109 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-ovn-rundir\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.794952 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-config\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.795208 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-69fd45f56f-msd9g" event={"ID":"2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c","Type":"ContainerDied","Data":"01a683f26d7d0c797e69c66dd05f9316c8cc8b73f17f468bda8553d1e13da8a7"}
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.795226 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-ovs-rundir\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.795264 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-ovn-rundir\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.795436 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-ovs-rundir\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.798462 master-0 kubenswrapper[16352]: I0307 21:40:20.795516 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-69fd45f56f-msd9g"
Mar 07 21:40:20.806907 master-0 kubenswrapper[16352]: I0307 21:40:20.800225 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hclv\" (UniqueName: \"kubernetes.io/projected/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-kube-api-access-9hclv\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.806907 master-0 kubenswrapper[16352]: I0307 21:40:20.800499 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.806907 master-0 kubenswrapper[16352]: I0307 21:40:20.802269 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-config\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.806907 master-0 kubenswrapper[16352]: I0307 21:40:20.802791 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csxfx" event={"ID":"cd69bc9d-5b53-4868-bfab-b3956e54600d","Type":"ContainerStarted","Data":"a8bf5e72584d4100cb73f2d9bf1d7e4ac8a095ad9bb47f14cfea381559ae5385"}
Mar 07 21:40:20.811229 master-0 kubenswrapper[16352]: I0307 21:40:20.811182 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.828240 master-0 kubenswrapper[16352]: I0307 21:40:20.827698 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c592859-6e1b-4c8c-afa7-9471e3991980","Type":"ContainerStarted","Data":"1ec4487dfce50f5997e2496281fe7d1951deb02bbd2272d386b0a18fefc836b2"}
Mar 07 21:40:20.829341 master-0 kubenswrapper[16352]: I0307 21:40:20.829221 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" podStartSLOduration=3.739282377 podStartE2EDuration="21.829184423s" podCreationTimestamp="2026-03-07 21:39:59 +0000 UTC" firstStartedPulling="2026-03-07 21:40:00.66329326 +0000 UTC m=+1323.733998319" lastFinishedPulling="2026-03-07 21:40:18.753195306 +0000 UTC m=+1341.823900365" observedRunningTime="2026-03-07 21:40:20.80072325 +0000 UTC m=+1343.871428309" watchObservedRunningTime="2026-03-07 21:40:20.829184423 +0000 UTC m=+1343.899889482"
Mar 07 21:40:20.832332 master-0 kubenswrapper[16352]: I0307 21:40:20.832289 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hclv\" (UniqueName: \"kubernetes.io/projected/32a23306-ab4d-4ba6-afb9-90e7b23d0bed-kube-api-access-9hclv\") pod \"ovn-controller-metrics-h69l5\" (UID: \"32a23306-ab4d-4ba6-afb9-90e7b23d0bed\") " pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.847742 master-0 kubenswrapper[16352]: I0307 21:40:20.847216 16352 scope.go:117] "RemoveContainer" containerID="bac06120d96ebeffe8c0d6c4b47ecff0b9246f3d14ad88082d9e91f9abbe428a"
Mar 07 21:40:20.895431 master-0 kubenswrapper[16352]: I0307 21:40:20.873184 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-m4658" event={"ID":"36578f90-945d-4f93-ac4e-346ff30e9119","Type":"ContainerStarted","Data":"c6d15a064ea78caea81c8ec9ca416c119725ae51c0f8be5c808ab19093f7c916"}
Mar 07 21:40:20.895431 master-0 kubenswrapper[16352]: I0307 21:40:20.873473 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7466868675-m4658"
Mar 07 21:40:20.895431 master-0 kubenswrapper[16352]: E0307 21:40:20.889388 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4628d7f0_d710_446c_a574_db5c172ff74d.slice\": RecentStats: unable to find data in memory cache]"
Mar 07 21:40:20.933337 master-0 kubenswrapper[16352]: I0307 21:40:20.923745 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7466868675-m4658"]
Mar 07 21:40:20.972710 master-0 kubenswrapper[16352]: I0307 21:40:20.968331 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-h69l5"
Mar 07 21:40:20.996712 master-0 kubenswrapper[16352]: I0307 21:40:20.992718 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7f654db4c5-5b5lg"]
Mar 07 21:40:20.996712 master-0 kubenswrapper[16352]: I0307 21:40:20.995586 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.010709 master-0 kubenswrapper[16352]: I0307 21:40:21.003059 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb"
Mar 07 21:40:21.016706 master-0 kubenswrapper[16352]: I0307 21:40:21.014152 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-vfb6d"]
Mar 07 21:40:21.147223 master-0 kubenswrapper[16352]: I0307 21:40:21.122730 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-config\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.147223 master-0 kubenswrapper[16352]: I0307 21:40:21.122901 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-ovsdbserver-nb\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.147223 master-0 kubenswrapper[16352]: I0307 21:40:21.122977 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-dns-svc\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.147223 master-0 kubenswrapper[16352]: I0307 21:40:21.123030 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lhn8\" (UniqueName: \"kubernetes.io/projected/09adeffb-21b4-4651-910f-1588cd295a27-kube-api-access-6lhn8\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.147223 master-0 kubenswrapper[16352]: I0307 21:40:21.139252 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-667b9d65dc-vfb6d"]
Mar 07 21:40:21.184976 master-0 kubenswrapper[16352]: I0307 21:40:21.184428 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f654db4c5-5b5lg"]
Mar 07 21:40:21.187812 master-0 kubenswrapper[16352]: I0307 21:40:21.187651 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7466868675-m4658" podStartSLOduration=3.778969198 podStartE2EDuration="22.187593669s" podCreationTimestamp="2026-03-07 21:39:59 +0000 UTC" firstStartedPulling="2026-03-07 21:40:00.290801176 +0000 UTC m=+1323.361506235" lastFinishedPulling="2026-03-07 21:40:18.699425647 +0000 UTC m=+1341.770130706" observedRunningTime="2026-03-07 21:40:20.937667231 +0000 UTC m=+1344.008372290" watchObservedRunningTime="2026-03-07 21:40:21.187593669 +0000 UTC m=+1344.258298728"
Mar 07 21:40:21.230119 master-0 kubenswrapper[16352]: I0307 21:40:21.230028 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4628d7f0-d710-446c-a574-db5c172ff74d" path="/var/lib/kubelet/pods/4628d7f0-d710-446c-a574-db5c172ff74d/volumes"
Mar 07 21:40:21.236391 master-0 kubenswrapper[16352]: I0307 21:40:21.234205 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-msd9g"]
Mar 07 21:40:21.239071 master-0 kubenswrapper[16352]: I0307 21:40:21.239019 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-ovsdbserver-nb\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.239505 master-0 kubenswrapper[16352]: I0307 21:40:21.239250 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-dns-svc\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.239629 master-0 kubenswrapper[16352]: I0307 21:40:21.239600 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lhn8\" (UniqueName: \"kubernetes.io/projected/09adeffb-21b4-4651-910f-1588cd295a27-kube-api-access-6lhn8\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.239726 master-0 kubenswrapper[16352]: I0307 21:40:21.239705 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-config\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.246749 master-0 kubenswrapper[16352]: I0307 21:40:21.245408 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-69fd45f56f-msd9g"]
Mar 07 21:40:21.246749 master-0 kubenswrapper[16352]: I0307 21:40:21.246317 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-ovsdbserver-nb\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg"
Mar 07 21:40:21.247048 master-0 kubenswrapper[16352]: I0307 21:40:21.246958 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-dns-svc\") pod
\"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:21.247928 master-0 kubenswrapper[16352]: I0307 21:40:21.247895 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-config\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:21.281234 master-0 kubenswrapper[16352]: I0307 21:40:21.281142 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-qtbgb"] Mar 07 21:40:21.308227 master-0 kubenswrapper[16352]: I0307 21:40:21.307998 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-pt84w"] Mar 07 21:40:21.313455 master-0 kubenswrapper[16352]: I0307 21:40:21.313412 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.318095 master-0 kubenswrapper[16352]: I0307 21:40:21.317484 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lhn8\" (UniqueName: \"kubernetes.io/projected/09adeffb-21b4-4651-910f-1588cd295a27-kube-api-access-6lhn8\") pod \"dnsmasq-dns-7f654db4c5-5b5lg\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:21.318095 master-0 kubenswrapper[16352]: I0307 21:40:21.317553 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 07 21:40:21.340261 master-0 kubenswrapper[16352]: I0307 21:40:21.336318 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:21.351527 master-0 kubenswrapper[16352]: I0307 21:40:21.351471 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-pt84w"] Mar 07 21:40:21.451513 master-0 kubenswrapper[16352]: I0307 21:40:21.451287 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ck4sg\" (UniqueName: \"kubernetes.io/projected/167747ff-9ac4-42be-9894-549a51404415-kube-api-access-ck4sg\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.453864 master-0 kubenswrapper[16352]: I0307 21:40:21.451550 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-config\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.453864 master-0 kubenswrapper[16352]: I0307 21:40:21.451917 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-dns-svc\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.453864 master-0 kubenswrapper[16352]: I0307 21:40:21.452202 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.453864 master-0 kubenswrapper[16352]: I0307 21:40:21.452355 
16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.556645 master-0 kubenswrapper[16352]: I0307 21:40:21.556525 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ck4sg\" (UniqueName: \"kubernetes.io/projected/167747ff-9ac4-42be-9894-549a51404415-kube-api-access-ck4sg\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.556645 master-0 kubenswrapper[16352]: I0307 21:40:21.556699 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-config\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.557124 master-0 kubenswrapper[16352]: I0307 21:40:21.556751 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-dns-svc\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.557124 master-0 kubenswrapper[16352]: I0307 21:40:21.556798 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.557124 master-0 kubenswrapper[16352]: 
I0307 21:40:21.556878 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.561268 master-0 kubenswrapper[16352]: I0307 21:40:21.558375 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-config\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.561268 master-0 kubenswrapper[16352]: I0307 21:40:21.559468 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-dns-svc\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.561268 master-0 kubenswrapper[16352]: I0307 21:40:21.561111 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-nb\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.565811 master-0 kubenswrapper[16352]: I0307 21:40:21.563790 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-sb\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.608783 master-0 kubenswrapper[16352]: I0307 21:40:21.606769 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ck4sg\" (UniqueName: \"kubernetes.io/projected/167747ff-9ac4-42be-9894-549a51404415-kube-api-access-ck4sg\") pod \"dnsmasq-dns-58dc6c9559-pt84w\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.672798 master-0 kubenswrapper[16352]: I0307 21:40:21.672469 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-h69l5"] Mar 07 21:40:21.771928 master-0 kubenswrapper[16352]: I0307 21:40:21.771764 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:21.897036 master-0 kubenswrapper[16352]: I0307 21:40:21.894173 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h69l5" event={"ID":"32a23306-ab4d-4ba6-afb9-90e7b23d0bed","Type":"ContainerStarted","Data":"dc4f77a40e206f9e3904176b2c2e022a080a71e8510a25169121b01def356674"} Mar 07 21:40:21.995407 master-0 kubenswrapper[16352]: I0307 21:40:21.995340 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7f654db4c5-5b5lg"] Mar 07 21:40:22.377460 master-0 kubenswrapper[16352]: W0307 21:40:22.377390 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod167747ff_9ac4_42be_9894_549a51404415.slice/crio-62c1d194d1d62073a66561eaabbdd97e0272bdad0ec866f0d1f06d74209fbae3 WatchSource:0}: Error finding container 62c1d194d1d62073a66561eaabbdd97e0272bdad0ec866f0d1f06d74209fbae3: Status 404 returned error can't find the container with id 62c1d194d1d62073a66561eaabbdd97e0272bdad0ec866f0d1f06d74209fbae3 Mar 07 21:40:22.377460 master-0 kubenswrapper[16352]: I0307 21:40:22.377401 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-pt84w"] Mar 07 21:40:22.952262 master-0 kubenswrapper[16352]: 
I0307 21:40:22.952085 16352 generic.go:334] "Generic (PLEG): container finished" podID="09adeffb-21b4-4651-910f-1588cd295a27" containerID="b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab" exitCode=0 Mar 07 21:40:22.952262 master-0 kubenswrapper[16352]: I0307 21:40:22.952212 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" event={"ID":"09adeffb-21b4-4651-910f-1588cd295a27","Type":"ContainerDied","Data":"b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab"} Mar 07 21:40:22.952262 master-0 kubenswrapper[16352]: I0307 21:40:22.952254 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" event={"ID":"09adeffb-21b4-4651-910f-1588cd295a27","Type":"ContainerStarted","Data":"8cab9eb2189062d2cefafa7e77192516e96ee4e3cb162308aec8b314aea62ea0"} Mar 07 21:40:22.956969 master-0 kubenswrapper[16352]: I0307 21:40:22.956912 16352 generic.go:334] "Generic (PLEG): container finished" podID="167747ff-9ac4-42be-9894-549a51404415" containerID="235e1853e357a4236ad9c348a09263f93caab1a6f301f445d9ec7cdd6ed4d86d" exitCode=0 Mar 07 21:40:22.957224 master-0 kubenswrapper[16352]: I0307 21:40:22.956982 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" event={"ID":"167747ff-9ac4-42be-9894-549a51404415","Type":"ContainerDied","Data":"235e1853e357a4236ad9c348a09263f93caab1a6f301f445d9ec7cdd6ed4d86d"} Mar 07 21:40:22.957295 master-0 kubenswrapper[16352]: I0307 21:40:22.957252 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" event={"ID":"167747ff-9ac4-42be-9894-549a51404415","Type":"ContainerStarted","Data":"62c1d194d1d62073a66561eaabbdd97e0272bdad0ec866f0d1f06d74209fbae3"} Mar 07 21:40:22.957295 master-0 kubenswrapper[16352]: I0307 21:40:22.957159 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7466868675-m4658" 
podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="dnsmasq-dns" containerID="cri-o://c6d15a064ea78caea81c8ec9ca416c119725ae51c0f8be5c808ab19093f7c916" gracePeriod=10 Mar 07 21:40:22.958418 master-0 kubenswrapper[16352]: I0307 21:40:22.958394 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="dnsmasq-dns" containerID="cri-o://856e6fab21192d51dfc9753d403ded467f75f1bdb4d0d3ab344b3aa7bc161f28" gracePeriod=10 Mar 07 21:40:23.246621 master-0 kubenswrapper[16352]: I0307 21:40:23.246461 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c" path="/var/lib/kubelet/pods/2c8bc6cc-9eef-42fa-bfe0-9d2a5d61a96c/volumes" Mar 07 21:40:23.974302 master-0 kubenswrapper[16352]: I0307 21:40:23.974236 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" event={"ID":"167747ff-9ac4-42be-9894-549a51404415","Type":"ContainerStarted","Data":"8d8f4443df157eeb09f76d18567cf29044279a3cac8b9cf89f2f0cdabcf68049"} Mar 07 21:40:23.974939 master-0 kubenswrapper[16352]: I0307 21:40:23.974461 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:23.980833 master-0 kubenswrapper[16352]: I0307 21:40:23.980797 16352 generic.go:334] "Generic (PLEG): container finished" podID="36578f90-945d-4f93-ac4e-346ff30e9119" containerID="c6d15a064ea78caea81c8ec9ca416c119725ae51c0f8be5c808ab19093f7c916" exitCode=0 Mar 07 21:40:23.980971 master-0 kubenswrapper[16352]: I0307 21:40:23.980888 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-m4658" event={"ID":"36578f90-945d-4f93-ac4e-346ff30e9119","Type":"ContainerDied","Data":"c6d15a064ea78caea81c8ec9ca416c119725ae51c0f8be5c808ab19093f7c916"} Mar 07 21:40:23.984603 master-0 kubenswrapper[16352]: I0307 
21:40:23.984378 16352 generic.go:334] "Generic (PLEG): container finished" podID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerID="856e6fab21192d51dfc9753d403ded467f75f1bdb4d0d3ab344b3aa7bc161f28" exitCode=0 Mar 07 21:40:23.984603 master-0 kubenswrapper[16352]: I0307 21:40:23.984548 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" event={"ID":"af1ce916-8e43-4899-9c97-9aba5f6c5679","Type":"ContainerDied","Data":"856e6fab21192d51dfc9753d403ded467f75f1bdb4d0d3ab344b3aa7bc161f28"} Mar 07 21:40:23.989194 master-0 kubenswrapper[16352]: I0307 21:40:23.989157 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" event={"ID":"09adeffb-21b4-4651-910f-1588cd295a27","Type":"ContainerStarted","Data":"0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48"} Mar 07 21:40:23.989447 master-0 kubenswrapper[16352]: I0307 21:40:23.989423 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:24.020211 master-0 kubenswrapper[16352]: I0307 21:40:24.020055 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" podStartSLOduration=3.020020327 podStartE2EDuration="3.020020327s" podCreationTimestamp="2026-03-07 21:40:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:40:24.000699975 +0000 UTC m=+1347.071405044" watchObservedRunningTime="2026-03-07 21:40:24.020020327 +0000 UTC m=+1347.090725426" Mar 07 21:40:24.037066 master-0 kubenswrapper[16352]: I0307 21:40:24.036963 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" podStartSLOduration=4.036936903 podStartE2EDuration="4.036936903s" podCreationTimestamp="2026-03-07 21:40:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:40:24.024907015 +0000 UTC m=+1347.095612064" watchObservedRunningTime="2026-03-07 21:40:24.036936903 +0000 UTC m=+1347.107641972" Mar 07 21:40:29.648942 master-0 kubenswrapper[16352]: I0307 21:40:29.648740 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7466868675-m4658" podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.165:5353: i/o timeout" Mar 07 21:40:30.002479 master-0 kubenswrapper[16352]: I0307 21:40:30.002251 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.166:5353: i/o timeout" Mar 07 21:40:31.327833 master-0 kubenswrapper[16352]: I0307 21:40:31.327211 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:40:31.333558 master-0 kubenswrapper[16352]: I0307 21:40:31.333508 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:40:31.347917 master-0 kubenswrapper[16352]: I0307 21:40:31.347862 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:31.388782 master-0 kubenswrapper[16352]: I0307 21:40:31.383101 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-dns-svc\") pod \"36578f90-945d-4f93-ac4e-346ff30e9119\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " Mar 07 21:40:31.388782 master-0 kubenswrapper[16352]: I0307 21:40:31.383219 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-dns-svc\") pod \"af1ce916-8e43-4899-9c97-9aba5f6c5679\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " Mar 07 21:40:31.388782 master-0 kubenswrapper[16352]: I0307 21:40:31.383439 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-config\") pod \"af1ce916-8e43-4899-9c97-9aba5f6c5679\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " Mar 07 21:40:31.388782 master-0 kubenswrapper[16352]: I0307 21:40:31.383601 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-config\") pod \"36578f90-945d-4f93-ac4e-346ff30e9119\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " Mar 07 21:40:31.388782 master-0 kubenswrapper[16352]: I0307 21:40:31.383702 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zp82f\" (UniqueName: \"kubernetes.io/projected/36578f90-945d-4f93-ac4e-346ff30e9119-kube-api-access-zp82f\") pod 
\"36578f90-945d-4f93-ac4e-346ff30e9119\" (UID: \"36578f90-945d-4f93-ac4e-346ff30e9119\") " Mar 07 21:40:31.388782 master-0 kubenswrapper[16352]: I0307 21:40:31.383982 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gm9bj\" (UniqueName: \"kubernetes.io/projected/af1ce916-8e43-4899-9c97-9aba5f6c5679-kube-api-access-gm9bj\") pod \"af1ce916-8e43-4899-9c97-9aba5f6c5679\" (UID: \"af1ce916-8e43-4899-9c97-9aba5f6c5679\") " Mar 07 21:40:31.388782 master-0 kubenswrapper[16352]: I0307 21:40:31.388457 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36578f90-945d-4f93-ac4e-346ff30e9119-kube-api-access-zp82f" (OuterVolumeSpecName: "kube-api-access-zp82f") pod "36578f90-945d-4f93-ac4e-346ff30e9119" (UID: "36578f90-945d-4f93-ac4e-346ff30e9119"). InnerVolumeSpecName "kube-api-access-zp82f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:40:31.399301 master-0 kubenswrapper[16352]: I0307 21:40:31.399245 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1ce916-8e43-4899-9c97-9aba5f6c5679-kube-api-access-gm9bj" (OuterVolumeSpecName: "kube-api-access-gm9bj") pod "af1ce916-8e43-4899-9c97-9aba5f6c5679" (UID: "af1ce916-8e43-4899-9c97-9aba5f6c5679"). InnerVolumeSpecName "kube-api-access-gm9bj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:40:31.434314 master-0 kubenswrapper[16352]: I0307 21:40:31.434241 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-config" (OuterVolumeSpecName: "config") pod "af1ce916-8e43-4899-9c97-9aba5f6c5679" (UID: "af1ce916-8e43-4899-9c97-9aba5f6c5679"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:31.437574 master-0 kubenswrapper[16352]: I0307 21:40:31.437545 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "36578f90-945d-4f93-ac4e-346ff30e9119" (UID: "36578f90-945d-4f93-ac4e-346ff30e9119"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:31.440827 master-0 kubenswrapper[16352]: I0307 21:40:31.440744 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "af1ce916-8e43-4899-9c97-9aba5f6c5679" (UID: "af1ce916-8e43-4899-9c97-9aba5f6c5679"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:31.447589 master-0 kubenswrapper[16352]: I0307 21:40:31.447550 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-config" (OuterVolumeSpecName: "config") pod "36578f90-945d-4f93-ac4e-346ff30e9119" (UID: "36578f90-945d-4f93-ac4e-346ff30e9119"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:31.489410 master-0 kubenswrapper[16352]: I0307 21:40:31.489324 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:31.489410 master-0 kubenswrapper[16352]: I0307 21:40:31.489400 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:31.489410 master-0 kubenswrapper[16352]: I0307 21:40:31.489412 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/af1ce916-8e43-4899-9c97-9aba5f6c5679-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:31.489596 master-0 kubenswrapper[16352]: I0307 21:40:31.489422 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/36578f90-945d-4f93-ac4e-346ff30e9119-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:31.489596 master-0 kubenswrapper[16352]: I0307 21:40:31.489437 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zp82f\" (UniqueName: \"kubernetes.io/projected/36578f90-945d-4f93-ac4e-346ff30e9119-kube-api-access-zp82f\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:31.489596 master-0 kubenswrapper[16352]: I0307 21:40:31.489448 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gm9bj\" (UniqueName: \"kubernetes.io/projected/af1ce916-8e43-4899-9c97-9aba5f6c5679-kube-api-access-gm9bj\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:31.774182 master-0 kubenswrapper[16352]: I0307 21:40:31.773961 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:31.886403 master-0 kubenswrapper[16352]: 
I0307 21:40:31.882775 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f654db4c5-5b5lg"] Mar 07 21:40:32.146915 master-0 kubenswrapper[16352]: I0307 21:40:32.146728 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7466868675-m4658" Mar 07 21:40:32.146915 master-0 kubenswrapper[16352]: I0307 21:40:32.146728 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7466868675-m4658" event={"ID":"36578f90-945d-4f93-ac4e-346ff30e9119","Type":"ContainerDied","Data":"049dc9ea02e55a40510288684d6488f5903a5720f0d275ce610ebf2ce135b134"} Mar 07 21:40:32.147421 master-0 kubenswrapper[16352]: I0307 21:40:32.146938 16352 scope.go:117] "RemoveContainer" containerID="c6d15a064ea78caea81c8ec9ca416c119725ae51c0f8be5c808ab19093f7c916" Mar 07 21:40:32.151922 master-0 kubenswrapper[16352]: I0307 21:40:32.151848 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" podUID="09adeffb-21b4-4651-910f-1588cd295a27" containerName="dnsmasq-dns" containerID="cri-o://0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48" gracePeriod=10 Mar 07 21:40:32.152325 master-0 kubenswrapper[16352]: I0307 21:40:32.152184 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" event={"ID":"af1ce916-8e43-4899-9c97-9aba5f6c5679","Type":"ContainerDied","Data":"4f0ed74801411574e69742a8963bfcf22ec3d9f72fb4577f499a464241086b59"} Mar 07 21:40:32.152325 master-0 kubenswrapper[16352]: I0307 21:40:32.152235 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" Mar 07 21:40:32.195573 master-0 kubenswrapper[16352]: I0307 21:40:32.195498 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7466868675-m4658"] Mar 07 21:40:32.218881 master-0 kubenswrapper[16352]: I0307 21:40:32.218791 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7466868675-m4658"] Mar 07 21:40:32.288536 master-0 kubenswrapper[16352]: I0307 21:40:32.288468 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-qtbgb"] Mar 07 21:40:32.307234 master-0 kubenswrapper[16352]: I0307 21:40:32.307168 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-76ff7d945-qtbgb"] Mar 07 21:40:32.459256 master-0 kubenswrapper[16352]: E0307 21:40:32.459176 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaf1ce916_8e43_4899_9c97_9aba5f6c5679.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:40:33.215246 master-0 kubenswrapper[16352]: I0307 21:40:33.209244 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36578f90-945d-4f93-ac4e-346ff30e9119" path="/var/lib/kubelet/pods/36578f90-945d-4f93-ac4e-346ff30e9119/volumes" Mar 07 21:40:33.215246 master-0 kubenswrapper[16352]: I0307 21:40:33.210380 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" path="/var/lib/kubelet/pods/af1ce916-8e43-4899-9c97-9aba5f6c5679/volumes" Mar 07 21:40:33.598057 master-0 kubenswrapper[16352]: I0307 21:40:33.597125 16352 scope.go:117] "RemoveContainer" containerID="adae7b29722b5fef6af3c97e8a0de88eedae76aefaf6fb4c97e12081bee72677" Mar 07 21:40:34.010644 master-0 kubenswrapper[16352]: I0307 21:40:34.010598 16352 scope.go:117] "RemoveContainer" 
containerID="856e6fab21192d51dfc9753d403ded467f75f1bdb4d0d3ab344b3aa7bc161f28" Mar 07 21:40:34.077259 master-0 kubenswrapper[16352]: I0307 21:40:34.077216 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:34.096803 master-0 kubenswrapper[16352]: I0307 21:40:34.096671 16352 scope.go:117] "RemoveContainer" containerID="3f7dcbd8ee8c4e6ce6efda577f7a03da7bb52d8ae4f0377a93036487710d1905" Mar 07 21:40:34.196308 master-0 kubenswrapper[16352]: I0307 21:40:34.196210 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-config\") pod \"09adeffb-21b4-4651-910f-1588cd295a27\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " Mar 07 21:40:34.196836 master-0 kubenswrapper[16352]: I0307 21:40:34.196749 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-ovsdbserver-nb\") pod \"09adeffb-21b4-4651-910f-1588cd295a27\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " Mar 07 21:40:34.197165 master-0 kubenswrapper[16352]: I0307 21:40:34.197111 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-dns-svc\") pod \"09adeffb-21b4-4651-910f-1588cd295a27\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " Mar 07 21:40:34.197235 master-0 kubenswrapper[16352]: I0307 21:40:34.197182 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lhn8\" (UniqueName: \"kubernetes.io/projected/09adeffb-21b4-4651-910f-1588cd295a27-kube-api-access-6lhn8\") pod \"09adeffb-21b4-4651-910f-1588cd295a27\" (UID: \"09adeffb-21b4-4651-910f-1588cd295a27\") " Mar 07 21:40:34.200088 master-0 kubenswrapper[16352]: I0307 
21:40:34.200030 16352 generic.go:334] "Generic (PLEG): container finished" podID="09adeffb-21b4-4651-910f-1588cd295a27" containerID="0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48" exitCode=0 Mar 07 21:40:34.200201 master-0 kubenswrapper[16352]: I0307 21:40:34.200172 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" event={"ID":"09adeffb-21b4-4651-910f-1588cd295a27","Type":"ContainerDied","Data":"0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48"} Mar 07 21:40:34.200258 master-0 kubenswrapper[16352]: I0307 21:40:34.200221 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" event={"ID":"09adeffb-21b4-4651-910f-1588cd295a27","Type":"ContainerDied","Data":"8cab9eb2189062d2cefafa7e77192516e96ee4e3cb162308aec8b314aea62ea0"} Mar 07 21:40:34.200304 master-0 kubenswrapper[16352]: I0307 21:40:34.200230 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7f654db4c5-5b5lg" Mar 07 21:40:34.200437 master-0 kubenswrapper[16352]: I0307 21:40:34.200249 16352 scope.go:117] "RemoveContainer" containerID="0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48" Mar 07 21:40:34.216504 master-0 kubenswrapper[16352]: I0307 21:40:34.216441 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09adeffb-21b4-4651-910f-1588cd295a27-kube-api-access-6lhn8" (OuterVolumeSpecName: "kube-api-access-6lhn8") pod "09adeffb-21b4-4651-910f-1588cd295a27" (UID: "09adeffb-21b4-4651-910f-1588cd295a27"). InnerVolumeSpecName "kube-api-access-6lhn8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:40:34.272745 master-0 kubenswrapper[16352]: I0307 21:40:34.272671 16352 scope.go:117] "RemoveContainer" containerID="b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab" Mar 07 21:40:34.299857 master-0 kubenswrapper[16352]: I0307 21:40:34.299805 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lhn8\" (UniqueName: \"kubernetes.io/projected/09adeffb-21b4-4651-910f-1588cd295a27-kube-api-access-6lhn8\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:34.319824 master-0 kubenswrapper[16352]: I0307 21:40:34.319737 16352 scope.go:117] "RemoveContainer" containerID="0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48" Mar 07 21:40:34.320673 master-0 kubenswrapper[16352]: I0307 21:40:34.320043 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "09adeffb-21b4-4651-910f-1588cd295a27" (UID: "09adeffb-21b4-4651-910f-1588cd295a27"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:34.320673 master-0 kubenswrapper[16352]: I0307 21:40:34.320604 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "09adeffb-21b4-4651-910f-1588cd295a27" (UID: "09adeffb-21b4-4651-910f-1588cd295a27"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:34.323712 master-0 kubenswrapper[16352]: E0307 21:40:34.323655 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48\": container with ID starting with 0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48 not found: ID does not exist" containerID="0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48" Mar 07 21:40:34.323787 master-0 kubenswrapper[16352]: I0307 21:40:34.323732 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48"} err="failed to get container status \"0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48\": rpc error: code = NotFound desc = could not find container \"0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48\": container with ID starting with 0afc125f81c0c8f708ee3b5ace7c414d4d172a448690bb4d6a8cfbbcae72aa48 not found: ID does not exist" Mar 07 21:40:34.323787 master-0 kubenswrapper[16352]: I0307 21:40:34.323775 16352 scope.go:117] "RemoveContainer" containerID="b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab" Mar 07 21:40:34.325073 master-0 kubenswrapper[16352]: E0307 21:40:34.324756 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab\": container with ID starting with b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab not found: ID does not exist" containerID="b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab" Mar 07 21:40:34.325073 master-0 kubenswrapper[16352]: I0307 21:40:34.324833 16352 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab"} err="failed to get container status \"b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab\": rpc error: code = NotFound desc = could not find container \"b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab\": container with ID starting with b6effa6f8fcb87de9f6f8d3b4fd40270633b414e1e106ddcf8647f56cc38d6ab not found: ID does not exist" Mar 07 21:40:34.325768 master-0 kubenswrapper[16352]: I0307 21:40:34.325738 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-config" (OuterVolumeSpecName: "config") pod "09adeffb-21b4-4651-910f-1588cd295a27" (UID: "09adeffb-21b4-4651-910f-1588cd295a27"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:34.402542 master-0 kubenswrapper[16352]: I0307 21:40:34.402394 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:34.402542 master-0 kubenswrapper[16352]: I0307 21:40:34.402464 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:34.402542 master-0 kubenswrapper[16352]: I0307 21:40:34.402482 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/09adeffb-21b4-4651-910f-1588cd295a27-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:34.548547 master-0 kubenswrapper[16352]: I0307 21:40:34.547964 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7f654db4c5-5b5lg"] Mar 07 21:40:34.573474 master-0 kubenswrapper[16352]: I0307 21:40:34.573404 16352 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7f654db4c5-5b5lg"] Mar 07 21:40:34.649637 master-0 kubenswrapper[16352]: I0307 21:40:34.649395 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7466868675-m4658" podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.165:5353: i/o timeout" Mar 07 21:40:35.003359 master-0 kubenswrapper[16352]: I0307 21:40:35.003123 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-76ff7d945-qtbgb" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.166:5353: i/o timeout" Mar 07 21:40:35.203933 master-0 kubenswrapper[16352]: I0307 21:40:35.203866 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09adeffb-21b4-4651-910f-1588cd295a27" path="/var/lib/kubelet/pods/09adeffb-21b4-4651-910f-1588cd295a27/volumes" Mar 07 21:40:35.235396 master-0 kubenswrapper[16352]: I0307 21:40:35.235334 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csxfx" event={"ID":"cd69bc9d-5b53-4868-bfab-b3956e54600d","Type":"ContainerStarted","Data":"85332f4c00e25eca142e081add8fa0de8e2de75a0ef0fe57173622ea04f88681"} Mar 07 21:40:35.243173 master-0 kubenswrapper[16352]: I0307 21:40:35.243086 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c592859-6e1b-4c8c-afa7-9471e3991980","Type":"ContainerStarted","Data":"ea5a1d5feb5b1d6d4ee923f3a97acd8a41de7d3531410afa2e4529f8ce4232b0"} Mar 07 21:40:35.248552 master-0 kubenswrapper[16352]: I0307 21:40:35.248505 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f284e621-c075-47ba-9d59-f6491ac0698a","Type":"ContainerStarted","Data":"666bf9db4ad15523698468145cf0d18b7cef0d58f1b58476f5158eb1e15bced9"} Mar 07 21:40:35.256244 master-0 
kubenswrapper[16352]: I0307 21:40:35.256170 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"90e6075f-5d67-4d68-a26b-621590c4ca33","Type":"ContainerStarted","Data":"5f5f501252082b7a7e4ff3ba03a636a2e7bd57c88f5aa9e35836b543747836b2"} Mar 07 21:40:36.273536 master-0 kubenswrapper[16352]: I0307 21:40:36.273373 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"7c592859-6e1b-4c8c-afa7-9471e3991980","Type":"ContainerStarted","Data":"2f3ece3c19f1e477533c9fa1d122100f969a56639cc5e5853efa16021eddc000"} Mar 07 21:40:36.276862 master-0 kubenswrapper[16352]: I0307 21:40:36.276795 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"f284e621-c075-47ba-9d59-f6491ac0698a","Type":"ContainerStarted","Data":"88068b41b026a2cf90b0a11c12466193ced67346ad2d9ed49b35efcc968b7d3e"} Mar 07 21:40:36.279217 master-0 kubenswrapper[16352]: I0307 21:40:36.279172 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1bc36cee-aa13-4fd2-873c-892c54978add","Type":"ContainerStarted","Data":"000ace5bf1eafbe043854f20adb830ad238e2205964aa3e41e7fd937759ab021"} Mar 07 21:40:36.282580 master-0 kubenswrapper[16352]: I0307 21:40:36.282530 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"ea79a140-1767-4d8d-b766-fd36a08926da","Type":"ContainerStarted","Data":"958a87449804f4698492dc5e3e8becc9fddfa9d6530c96398f95885f5eb81019"} Mar 07 21:40:36.282762 master-0 kubenswrapper[16352]: I0307 21:40:36.282733 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 07 21:40:36.285087 master-0 kubenswrapper[16352]: I0307 21:40:36.284794 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-h69l5" 
event={"ID":"32a23306-ab4d-4ba6-afb9-90e7b23d0bed","Type":"ContainerStarted","Data":"182fe22c709f6dc9fb1c90084b867a320b589ab13f43f21f78d3fb9bef6aa638"} Mar 07 21:40:36.289628 master-0 kubenswrapper[16352]: I0307 21:40:36.289571 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wptpb" event={"ID":"59890fa2-937c-4fc1-9f3d-6c2297a9d46b","Type":"ContainerStarted","Data":"3d5b5fbf7a0e78f8b4e456aebb1a32db4a640d8d76abf1018c050a49fb0d7021"} Mar 07 21:40:36.292811 master-0 kubenswrapper[16352]: I0307 21:40:36.292731 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f025883-7fbd-4887-9328-36ba8b9c326b","Type":"ContainerStarted","Data":"430f1bf4eec06fab444c5588005d35eb7be06654e63c8278b5926d996bb92429"} Mar 07 21:40:36.294508 master-0 kubenswrapper[16352]: I0307 21:40:36.294440 16352 generic.go:334] "Generic (PLEG): container finished" podID="cd69bc9d-5b53-4868-bfab-b3956e54600d" containerID="85332f4c00e25eca142e081add8fa0de8e2de75a0ef0fe57173622ea04f88681" exitCode=0 Mar 07 21:40:36.294625 master-0 kubenswrapper[16352]: I0307 21:40:36.294504 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csxfx" event={"ID":"cd69bc9d-5b53-4868-bfab-b3956e54600d","Type":"ContainerDied","Data":"85332f4c00e25eca142e081add8fa0de8e2de75a0ef0fe57173622ea04f88681"} Mar 07 21:40:36.364836 master-0 kubenswrapper[16352]: I0307 21:40:36.364653 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=11.436754335 podStartE2EDuration="25.364616078s" podCreationTimestamp="2026-03-07 21:40:11 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.669632291 +0000 UTC m=+1342.740337350" lastFinishedPulling="2026-03-07 21:40:33.597494024 +0000 UTC m=+1356.668199093" observedRunningTime="2026-03-07 21:40:36.357455166 +0000 UTC m=+1359.428160235" watchObservedRunningTime="2026-03-07 21:40:36.364616078 +0000 UTC 
m=+1359.435321177" Mar 07 21:40:36.601305 master-0 kubenswrapper[16352]: I0307 21:40:36.601120 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.131631607 podStartE2EDuration="26.601098305s" podCreationTimestamp="2026-03-07 21:40:10 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.544704838 +0000 UTC m=+1342.615409897" lastFinishedPulling="2026-03-07 21:40:34.014171536 +0000 UTC m=+1357.084876595" observedRunningTime="2026-03-07 21:40:36.593286417 +0000 UTC m=+1359.663991476" watchObservedRunningTime="2026-03-07 21:40:36.601098305 +0000 UTC m=+1359.671803364" Mar 07 21:40:36.805342 master-0 kubenswrapper[16352]: I0307 21:40:36.805173 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-wptpb" Mar 07 21:40:36.821398 master-0 kubenswrapper[16352]: I0307 21:40:36.821318 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:36.981534 master-0 kubenswrapper[16352]: I0307 21:40:36.981409 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=19.360554908 podStartE2EDuration="33.981376273s" podCreationTimestamp="2026-03-07 21:40:03 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.428925864 +0000 UTC m=+1342.499630923" lastFinishedPulling="2026-03-07 21:40:34.049747209 +0000 UTC m=+1357.120452288" observedRunningTime="2026-03-07 21:40:36.972796428 +0000 UTC m=+1360.043501527" watchObservedRunningTime="2026-03-07 21:40:36.981376273 +0000 UTC m=+1360.052081372" Mar 07 21:40:37.277155 master-0 kubenswrapper[16352]: I0307 21:40:37.277001 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-h69l5" podStartSLOduration=4.82639163 podStartE2EDuration="17.276973889s" podCreationTimestamp="2026-03-07 21:40:20 +0000 UTC" firstStartedPulling="2026-03-07 
21:40:21.658160193 +0000 UTC m=+1344.728865252" lastFinishedPulling="2026-03-07 21:40:34.108742452 +0000 UTC m=+1357.179447511" observedRunningTime="2026-03-07 21:40:37.264312605 +0000 UTC m=+1360.335017684" watchObservedRunningTime="2026-03-07 21:40:37.276973889 +0000 UTC m=+1360.347678948" Mar 07 21:40:37.309604 master-0 kubenswrapper[16352]: I0307 21:40:37.309517 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4d2620a2-19d9-4543-922c-dc7951734958","Type":"ContainerStarted","Data":"831b8bfe86eef604f21234ea36051d147718c53685172c2c1b6643f3cbe45146"} Mar 07 21:40:37.314973 master-0 kubenswrapper[16352]: I0307 21:40:37.314783 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csxfx" event={"ID":"cd69bc9d-5b53-4868-bfab-b3956e54600d","Type":"ContainerStarted","Data":"9f74a9972dacff0dd6b5231688a3eb901ea9d5f5bb99a922e943383fd7729169"} Mar 07 21:40:37.909982 master-0 kubenswrapper[16352]: I0307 21:40:37.909857 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-wptpb" podStartSLOduration=12.335514031 podStartE2EDuration="26.9098355s" podCreationTimestamp="2026-03-07 21:40:11 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.477465638 +0000 UTC m=+1342.548170697" lastFinishedPulling="2026-03-07 21:40:34.051787097 +0000 UTC m=+1357.122492166" observedRunningTime="2026-03-07 21:40:37.903739323 +0000 UTC m=+1360.974444412" watchObservedRunningTime="2026-03-07 21:40:37.9098355 +0000 UTC m=+1360.980540559" Mar 07 21:40:38.275881 master-0 kubenswrapper[16352]: I0307 21:40:38.275655 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:38.335960 master-0 kubenswrapper[16352]: I0307 21:40:38.335860 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-csxfx" 
event={"ID":"cd69bc9d-5b53-4868-bfab-b3956e54600d","Type":"ContainerStarted","Data":"74e4443c46cca8d0e8d0a347f402e8b49bee94e706ec239fa7af7ee0bebf538b"} Mar 07 21:40:38.336608 master-0 kubenswrapper[16352]: I0307 21:40:38.336508 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:38.825346 master-0 kubenswrapper[16352]: I0307 21:40:38.825224 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-csxfx" podStartSLOduration=14.405313951 podStartE2EDuration="27.825200493s" podCreationTimestamp="2026-03-07 21:40:11 +0000 UTC" firstStartedPulling="2026-03-07 21:40:20.539551453 +0000 UTC m=+1343.610256512" lastFinishedPulling="2026-03-07 21:40:33.959437985 +0000 UTC m=+1357.030143054" observedRunningTime="2026-03-07 21:40:38.820802996 +0000 UTC m=+1361.891508055" watchObservedRunningTime="2026-03-07 21:40:38.825200493 +0000 UTC m=+1361.895905552" Mar 07 21:40:39.276156 master-0 kubenswrapper[16352]: I0307 21:40:39.276057 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:39.335820 master-0 kubenswrapper[16352]: I0307 21:40:39.335728 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 07 21:40:39.348097 master-0 kubenswrapper[16352]: I0307 21:40:39.347995 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:39.349408 master-0 kubenswrapper[16352]: I0307 21:40:39.349384 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:40:39.894714 master-0 kubenswrapper[16352]: I0307 21:40:39.892737 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:39.894714 master-0 kubenswrapper[16352]: I0307 21:40:39.894502 16352 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:39.949706 master-0 kubenswrapper[16352]: I0307 21:40:39.946978 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 07 21:40:40.282580 master-0 kubenswrapper[16352]: I0307 21:40:40.282489 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 07 21:40:40.283211 master-0 kubenswrapper[16352]: E0307 21:40:40.283179 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="dnsmasq-dns" Mar 07 21:40:40.283211 master-0 kubenswrapper[16352]: I0307 21:40:40.283206 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="dnsmasq-dns" Mar 07 21:40:40.283308 master-0 kubenswrapper[16352]: E0307 21:40:40.283276 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="dnsmasq-dns" Mar 07 21:40:40.283308 master-0 kubenswrapper[16352]: I0307 21:40:40.283285 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="dnsmasq-dns" Mar 07 21:40:40.283308 master-0 kubenswrapper[16352]: E0307 21:40:40.283292 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09adeffb-21b4-4651-910f-1588cd295a27" containerName="init" Mar 07 21:40:40.283308 master-0 kubenswrapper[16352]: I0307 21:40:40.283299 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="09adeffb-21b4-4651-910f-1588cd295a27" containerName="init" Mar 07 21:40:40.283308 master-0 kubenswrapper[16352]: E0307 21:40:40.283312 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="init" Mar 07 21:40:40.283484 master-0 kubenswrapper[16352]: I0307 21:40:40.283322 16352 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="init" Mar 07 21:40:40.283484 master-0 kubenswrapper[16352]: E0307 21:40:40.283334 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09adeffb-21b4-4651-910f-1588cd295a27" containerName="dnsmasq-dns" Mar 07 21:40:40.283484 master-0 kubenswrapper[16352]: I0307 21:40:40.283341 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="09adeffb-21b4-4651-910f-1588cd295a27" containerName="dnsmasq-dns" Mar 07 21:40:40.283484 master-0 kubenswrapper[16352]: E0307 21:40:40.283360 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="init" Mar 07 21:40:40.283484 master-0 kubenswrapper[16352]: I0307 21:40:40.283366 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="init" Mar 07 21:40:40.283670 master-0 kubenswrapper[16352]: I0307 21:40:40.283629 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="36578f90-945d-4f93-ac4e-346ff30e9119" containerName="dnsmasq-dns" Mar 07 21:40:40.283724 master-0 kubenswrapper[16352]: I0307 21:40:40.283699 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="09adeffb-21b4-4651-910f-1588cd295a27" containerName="dnsmasq-dns" Mar 07 21:40:40.283724 master-0 kubenswrapper[16352]: I0307 21:40:40.283713 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1ce916-8e43-4899-9c97-9aba5f6c5679" containerName="dnsmasq-dns" Mar 07 21:40:40.285734 master-0 kubenswrapper[16352]: I0307 21:40:40.285651 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 07 21:40:40.289284 master-0 kubenswrapper[16352]: I0307 21:40:40.289099 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 07 21:40:40.289916 master-0 kubenswrapper[16352]: I0307 21:40:40.289858 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 07 21:40:40.290040 master-0 kubenswrapper[16352]: I0307 21:40:40.289866 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 07 21:40:40.301671 master-0 kubenswrapper[16352]: I0307 21:40:40.301619 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 07 21:40:40.377997 master-0 kubenswrapper[16352]: I0307 21:40:40.377915 16352 generic.go:334] "Generic (PLEG): container finished" podID="90e6075f-5d67-4d68-a26b-621590c4ca33" containerID="5f5f501252082b7a7e4ff3ba03a636a2e7bd57c88f5aa9e35836b543747836b2" exitCode=0 Mar 07 21:40:40.377997 master-0 kubenswrapper[16352]: I0307 21:40:40.378000 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"90e6075f-5d67-4d68-a26b-621590c4ca33","Type":"ContainerDied","Data":"5f5f501252082b7a7e4ff3ba03a636a2e7bd57c88f5aa9e35836b543747836b2"} Mar 07 21:40:40.393211 master-0 kubenswrapper[16352]: I0307 21:40:40.393167 16352 generic.go:334] "Generic (PLEG): container finished" podID="1bc36cee-aa13-4fd2-873c-892c54978add" containerID="000ace5bf1eafbe043854f20adb830ad238e2205964aa3e41e7fd937759ab021" exitCode=0 Mar 07 21:40:40.393352 master-0 kubenswrapper[16352]: I0307 21:40:40.393321 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1bc36cee-aa13-4fd2-873c-892c54978add","Type":"ContainerDied","Data":"000ace5bf1eafbe043854f20adb830ad238e2205964aa3e41e7fd937759ab021"} Mar 07 21:40:40.422776 master-0 kubenswrapper[16352]: I0307 
21:40:40.415796 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822f550c-6bf0-4485-9740-6045f44dff4e-scripts\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.422776 master-0 kubenswrapper[16352]: I0307 21:40:40.415928 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hv4f4\" (UniqueName: \"kubernetes.io/projected/822f550c-6bf0-4485-9740-6045f44dff4e-kube-api-access-hv4f4\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.422776 master-0 kubenswrapper[16352]: I0307 21:40:40.415975 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822f550c-6bf0-4485-9740-6045f44dff4e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.422776 master-0 kubenswrapper[16352]: I0307 21:40:40.416037 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.422776 master-0 kubenswrapper[16352]: I0307 21:40:40.416124 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822f550c-6bf0-4485-9740-6045f44dff4e-config\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.422776 master-0 kubenswrapper[16352]: I0307 21:40:40.416235 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.422776 master-0 kubenswrapper[16352]: I0307 21:40:40.416271 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.518257 master-0 kubenswrapper[16352]: I0307 21:40:40.518183 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.518382 master-0 kubenswrapper[16352]: I0307 21:40:40.518302 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822f550c-6bf0-4485-9740-6045f44dff4e-config\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.518421 master-0 kubenswrapper[16352]: I0307 21:40:40.518409 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.518455 master-0 kubenswrapper[16352]: I0307 21:40:40.518429 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.518695 master-0 kubenswrapper[16352]: I0307 21:40:40.518646 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822f550c-6bf0-4485-9740-6045f44dff4e-scripts\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.518871 master-0 kubenswrapper[16352]: I0307 21:40:40.518847 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hv4f4\" (UniqueName: \"kubernetes.io/projected/822f550c-6bf0-4485-9740-6045f44dff4e-kube-api-access-hv4f4\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.518920 master-0 kubenswrapper[16352]: I0307 21:40:40.518911 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822f550c-6bf0-4485-9740-6045f44dff4e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.522054 master-0 kubenswrapper[16352]: I0307 21:40:40.521714 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/822f550c-6bf0-4485-9740-6045f44dff4e-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0" Mar 07 21:40:40.523587 master-0 kubenswrapper[16352]: I0307 21:40:40.523553 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/822f550c-6bf0-4485-9740-6045f44dff4e-scripts\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " 
pod="openstack/ovn-northd-0"
Mar 07 21:40:40.524203 master-0 kubenswrapper[16352]: I0307 21:40:40.524163 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/822f550c-6bf0-4485-9740-6045f44dff4e-config\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0"
Mar 07 21:40:40.528630 master-0 kubenswrapper[16352]: I0307 21:40:40.528530 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0"
Mar 07 21:40:40.529471 master-0 kubenswrapper[16352]: I0307 21:40:40.529270 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0"
Mar 07 21:40:40.532440 master-0 kubenswrapper[16352]: I0307 21:40:40.532403 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/822f550c-6bf0-4485-9740-6045f44dff4e-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0"
Mar 07 21:40:40.544313 master-0 kubenswrapper[16352]: I0307 21:40:40.544272 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hv4f4\" (UniqueName: \"kubernetes.io/projected/822f550c-6bf0-4485-9740-6045f44dff4e-kube-api-access-hv4f4\") pod \"ovn-northd-0\" (UID: \"822f550c-6bf0-4485-9740-6045f44dff4e\") " pod="openstack/ovn-northd-0"
Mar 07 21:40:40.665346 master-0 kubenswrapper[16352]: I0307 21:40:40.665273 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-northd-0"
Mar 07 21:40:41.143032 master-0 kubenswrapper[16352]: I0307 21:40:41.140141 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"]
Mar 07 21:40:41.148361 master-0 kubenswrapper[16352]: W0307 21:40:41.148225 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod822f550c_6bf0_4485_9740_6045f44dff4e.slice/crio-7cb797deac2b422e86d468e9d181eac80e952c57a14920fac757c9f4bd578120 WatchSource:0}: Error finding container 7cb797deac2b422e86d468e9d181eac80e952c57a14920fac757c9f4bd578120: Status 404 returned error can't find the container with id 7cb797deac2b422e86d468e9d181eac80e952c57a14920fac757c9f4bd578120
Mar 07 21:40:41.421451 master-0 kubenswrapper[16352]: I0307 21:40:41.421239 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"822f550c-6bf0-4485-9740-6045f44dff4e","Type":"ContainerStarted","Data":"7cb797deac2b422e86d468e9d181eac80e952c57a14920fac757c9f4bd578120"}
Mar 07 21:40:41.425635 master-0 kubenswrapper[16352]: I0307 21:40:41.425595 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"1bc36cee-aa13-4fd2-873c-892c54978add","Type":"ContainerStarted","Data":"d2d67ff5da0aa40f5703aebe4d05fc9240735bb3a54ac96703d554727b4ff6ec"}
Mar 07 21:40:41.429988 master-0 kubenswrapper[16352]: I0307 21:40:41.429926 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"90e6075f-5d67-4d68-a26b-621590c4ca33","Type":"ContainerStarted","Data":"b6449419987a5838177df8e4cd17cfbf1d1ae062b63db91f2d4652d68e463fa5"}
Mar 07 21:40:41.467477 master-0 kubenswrapper[16352]: I0307 21:40:41.467366 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=26.982772525 podStartE2EDuration="41.467335804s" podCreationTimestamp="2026-03-07 21:40:00 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.564634066 +0000 UTC m=+1342.635339135" lastFinishedPulling="2026-03-07 21:40:34.049197315 +0000 UTC m=+1357.119902414" observedRunningTime="2026-03-07 21:40:41.46049892 +0000 UTC m=+1364.531203979" watchObservedRunningTime="2026-03-07 21:40:41.467335804 +0000 UTC m=+1364.538040893"
Mar 07 21:40:41.510261 master-0 kubenswrapper[16352]: I0307 21:40:41.510076 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=24.980953492 podStartE2EDuration="39.510033309s" podCreationTimestamp="2026-03-07 21:40:02 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.428197127 +0000 UTC m=+1342.498902196" lastFinishedPulling="2026-03-07 21:40:33.957276944 +0000 UTC m=+1357.027982013" observedRunningTime="2026-03-07 21:40:41.497105898 +0000 UTC m=+1364.567810957" watchObservedRunningTime="2026-03-07 21:40:41.510033309 +0000 UTC m=+1364.580738458"
Mar 07 21:40:43.453762 master-0 kubenswrapper[16352]: I0307 21:40:43.453641 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"822f550c-6bf0-4485-9740-6045f44dff4e","Type":"ContainerStarted","Data":"e511253d790eb9637ca635c18a5c470ffa3acb918bf6fc1d72e6d2fda032c374"}
Mar 07 21:40:43.454605 master-0 kubenswrapper[16352]: I0307 21:40:43.453771 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"822f550c-6bf0-4485-9740-6045f44dff4e","Type":"ContainerStarted","Data":"576be847f105388df1ceb4f1f951f5137479647d26eecca1488cc809256423a3"}
Mar 07 21:40:43.454605 master-0 kubenswrapper[16352]: I0307 21:40:43.453957 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0"
Mar 07 21:40:43.506203 master-0 kubenswrapper[16352]: I0307 21:40:43.506052 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.40912506 podStartE2EDuration="3.50601517s" podCreationTimestamp="2026-03-07 21:40:40 +0000 UTC" firstStartedPulling="2026-03-07 21:40:41.153539841 +0000 UTC m=+1364.224244900" lastFinishedPulling="2026-03-07 21:40:42.250429941 +0000 UTC m=+1365.321135010" observedRunningTime="2026-03-07 21:40:43.494324479 +0000 UTC m=+1366.565029538" watchObservedRunningTime="2026-03-07 21:40:43.50601517 +0000 UTC m=+1366.576720239"
Mar 07 21:40:44.182552 master-0 kubenswrapper[16352]: I0307 21:40:44.182409 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0"
Mar 07 21:40:46.264288 master-0 kubenswrapper[16352]: I0307 21:40:46.264169 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-7fbfp"]
Mar 07 21:40:46.268493 master-0 kubenswrapper[16352]: I0307 21:40:46.266328 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.283901 master-0 kubenswrapper[16352]: I0307 21:40:46.283781 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-7fbfp"]
Mar 07 21:40:46.309828 master-0 kubenswrapper[16352]: I0307 21:40:46.307752 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-nb\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.309828 master-0 kubenswrapper[16352]: I0307 21:40:46.307814 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-dns-svc\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.309828 master-0 kubenswrapper[16352]: I0307 21:40:46.307871 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-config\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.309828 master-0 kubenswrapper[16352]: I0307 21:40:46.307934 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-sb\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.309828 master-0 kubenswrapper[16352]: I0307 21:40:46.308252 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kmkw6\" (UniqueName: \"kubernetes.io/projected/f910f111-201a-459a-a4e6-3dd57dc69897-kube-api-access-kmkw6\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.416119 master-0 kubenswrapper[16352]: I0307 21:40:46.416065 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kmkw6\" (UniqueName: \"kubernetes.io/projected/f910f111-201a-459a-a4e6-3dd57dc69897-kube-api-access-kmkw6\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.416871 master-0 kubenswrapper[16352]: I0307 21:40:46.416852 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-nb\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.417099 master-0 kubenswrapper[16352]: I0307 21:40:46.417080 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-dns-svc\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.417353 master-0 kubenswrapper[16352]: I0307 21:40:46.417336 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-config\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.418498 master-0 kubenswrapper[16352]: I0307 21:40:46.418388 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-config\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.418633 master-0 kubenswrapper[16352]: I0307 21:40:46.418171 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-dns-svc\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.418754 master-0 kubenswrapper[16352]: I0307 21:40:46.418734 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-sb\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.419230 master-0 kubenswrapper[16352]: I0307 21:40:46.419185 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-nb\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.419480 master-0 kubenswrapper[16352]: I0307 21:40:46.419464 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-sb\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.437190 master-0 kubenswrapper[16352]: I0307 21:40:46.437147 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kmkw6\" (UniqueName: \"kubernetes.io/projected/f910f111-201a-459a-a4e6-3dd57dc69897-kube-api-access-kmkw6\") pod \"dnsmasq-dns-d6c6c44c5-7fbfp\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:46.618793 master-0 kubenswrapper[16352]: I0307 21:40:46.618652 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:47.109773 master-0 kubenswrapper[16352]: W0307 21:40:47.109615 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf910f111_201a_459a_a4e6_3dd57dc69897.slice/crio-45c0ccfd744c65b0f0b187f38a2782650198e4e99723e17a8a57dc40a48429e4 WatchSource:0}: Error finding container 45c0ccfd744c65b0f0b187f38a2782650198e4e99723e17a8a57dc40a48429e4: Status 404 returned error can't find the container with id 45c0ccfd744c65b0f0b187f38a2782650198e4e99723e17a8a57dc40a48429e4
Mar 07 21:40:47.118551 master-0 kubenswrapper[16352]: I0307 21:40:47.118487 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-7fbfp"]
Mar 07 21:40:47.509478 master-0 kubenswrapper[16352]: I0307 21:40:47.509337 16352 generic.go:334] "Generic (PLEG): container finished" podID="f910f111-201a-459a-a4e6-3dd57dc69897" containerID="0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818" exitCode=0
Mar 07 21:40:47.510233 master-0 kubenswrapper[16352]: I0307 21:40:47.510199 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" event={"ID":"f910f111-201a-459a-a4e6-3dd57dc69897","Type":"ContainerDied","Data":"0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818"}
Mar 07 21:40:47.510358 master-0 kubenswrapper[16352]: I0307 21:40:47.510339 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" event={"ID":"f910f111-201a-459a-a4e6-3dd57dc69897","Type":"ContainerStarted","Data":"45c0ccfd744c65b0f0b187f38a2782650198e4e99723e17a8a57dc40a48429e4"}
Mar 07 21:40:48.302105 master-0 kubenswrapper[16352]: I0307 21:40:48.302018 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"]
Mar 07 21:40:48.327645 master-0 kubenswrapper[16352]: I0307 21:40:48.327557 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0"
Mar 07 21:40:48.340144 master-0 kubenswrapper[16352]: I0307 21:40:48.340052 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf"
Mar 07 21:40:48.340620 master-0 kubenswrapper[16352]: I0307 21:40:48.340582 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files"
Mar 07 21:40:48.341195 master-0 kubenswrapper[16352]: I0307 21:40:48.341147 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data"
Mar 07 21:40:48.361667 master-0 kubenswrapper[16352]: I0307 21:40:48.360454 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"]
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.525346 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.525499 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-lock\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.525645 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-c7378389-6847-472c-b514-9e1417dd82a9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^943640de-8f7f-4f74-aae2-cf8bb21c498a\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.526038 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.526416 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zp8h2\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-kube-api-access-zp8h2\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.526533 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-cache\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.529749 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" event={"ID":"f910f111-201a-459a-a4e6-3dd57dc69897","Type":"ContainerStarted","Data":"c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5"}
Mar 07 21:40:48.530176 master-0 kubenswrapper[16352]: I0307 21:40:48.529992 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp"
Mar 07 21:40:48.569774 master-0 kubenswrapper[16352]: I0307 21:40:48.569454 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" podStartSLOduration=2.569427572 podStartE2EDuration="2.569427572s" podCreationTimestamp="2026-03-07 21:40:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:40:48.550741494 +0000 UTC m=+1371.621446573" watchObservedRunningTime="2026-03-07 21:40:48.569427572 +0000 UTC m=+1371.640132631"
Mar 07 21:40:48.628284 master-0 kubenswrapper[16352]: I0307 21:40:48.628188 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zp8h2\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-kube-api-access-zp8h2\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.628284 master-0 kubenswrapper[16352]: I0307 21:40:48.628272 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-cache\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.628284 master-0 kubenswrapper[16352]: I0307 21:40:48.628299 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.628713 master-0 kubenswrapper[16352]: I0307 21:40:48.628359 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-lock\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.628713 master-0 kubenswrapper[16352]: I0307 21:40:48.628390 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-c7378389-6847-472c-b514-9e1417dd82a9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^943640de-8f7f-4f74-aae2-cf8bb21c498a\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.628713 master-0 kubenswrapper[16352]: E0307 21:40:48.628607 16352 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 07 21:40:48.628713 master-0 kubenswrapper[16352]: E0307 21:40:48.628648 16352 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 07 21:40:48.628914 master-0 kubenswrapper[16352]: E0307 21:40:48.628728 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift podName:c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62 nodeName:}" failed. No retries permitted until 2026-03-07 21:40:49.128707205 +0000 UTC m=+1372.199412264 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift") pod "swift-storage-0" (UID: "c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62") : configmap "swift-ring-files" not found
Mar 07 21:40:48.628983 master-0 kubenswrapper[16352]: I0307 21:40:48.628891 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.632723 master-0 kubenswrapper[16352]: I0307 21:40:48.629046 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-cache\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.632723 master-0 kubenswrapper[16352]: I0307 21:40:48.629858 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-lock\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.632723 master-0 kubenswrapper[16352]: I0307 21:40:48.630910 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 07 21:40:48.632723 master-0 kubenswrapper[16352]: I0307 21:40:48.630933 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-c7378389-6847-472c-b514-9e1417dd82a9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^943640de-8f7f-4f74-aae2-cf8bb21c498a\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/008fce3a65f33ae0435c3ddd3bcbbd08698f0742b61d05daf0cf7760959d0008/globalmount\"" pod="openstack/swift-storage-0"
Mar 07 21:40:48.633092 master-0 kubenswrapper[16352]: I0307 21:40:48.632984 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:48.654739 master-0 kubenswrapper[16352]: I0307 21:40:48.654659 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zp8h2\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-kube-api-access-zp8h2\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:49.146278 master-0 kubenswrapper[16352]: I0307 21:40:49.146204 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:49.147222 master-0 kubenswrapper[16352]: E0307 21:40:49.146521 16352 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 07 21:40:49.147222 master-0 kubenswrapper[16352]: E0307 21:40:49.146538 16352 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 07 21:40:49.147222 master-0 kubenswrapper[16352]: E0307 21:40:49.146606 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift podName:c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62 nodeName:}" failed. No retries permitted until 2026-03-07 21:40:50.146591166 +0000 UTC m=+1373.217296225 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift") pod "swift-storage-0" (UID: "c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62") : configmap "swift-ring-files" not found
Mar 07 21:40:49.404222 master-0 kubenswrapper[16352]: I0307 21:40:49.404047 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0"
Mar 07 21:40:49.404222 master-0 kubenswrapper[16352]: I0307 21:40:49.404156 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0"
Mar 07 21:40:49.519789 master-0 kubenswrapper[16352]: I0307 21:40:49.519710 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0"
Mar 07 21:40:49.631221 master-0 kubenswrapper[16352]: I0307 21:40:49.631146 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0"
Mar 07 21:40:50.055289 master-0 kubenswrapper[16352]: I0307 21:40:50.055146 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-c7378389-6847-472c-b514-9e1417dd82a9\" (UniqueName: \"kubernetes.io/csi/topolvm.io^943640de-8f7f-4f74-aae2-cf8bb21c498a\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:50.195721 master-0 kubenswrapper[16352]: I0307 21:40:50.195603 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0"
Mar 07 21:40:50.195998 master-0 kubenswrapper[16352]: I0307 21:40:50.195855 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0"
Mar 07 21:40:50.197751 master-0 kubenswrapper[16352]: I0307 21:40:50.197620 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0"
Mar 07 21:40:50.197876 master-0 kubenswrapper[16352]: E0307 21:40:50.197829 16352 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found
Mar 07 21:40:50.197876 master-0 kubenswrapper[16352]: E0307 21:40:50.197864 16352 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found
Mar 07 21:40:50.197976 master-0 kubenswrapper[16352]: E0307 21:40:50.197926 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift podName:c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62 nodeName:}" failed. No retries permitted until 2026-03-07 21:40:52.197906032 +0000 UTC m=+1375.268611091 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift") pod "swift-storage-0" (UID: "c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62") : configmap "swift-ring-files" not found
Mar 07 21:40:50.314894 master-0 kubenswrapper[16352]: I0307 21:40:50.314652 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0"
Mar 07 21:40:50.674259 master-0 kubenswrapper[16352]: I0307 21:40:50.674160 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0"
Mar 07 21:40:50.838043 master-0 kubenswrapper[16352]: I0307 21:40:50.837980 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-gshbr"]
Mar 07 21:40:50.840413 master-0 kubenswrapper[16352]: I0307 21:40:50.840367 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:50.844461 master-0 kubenswrapper[16352]: I0307 21:40:50.844417 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret"
Mar 07 21:40:50.874879 master-0 kubenswrapper[16352]: I0307 21:40:50.874788 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gshbr"]
Mar 07 21:40:50.926381 master-0 kubenswrapper[16352]: I0307 21:40:50.926210 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8prw\" (UniqueName: \"kubernetes.io/projected/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-kube-api-access-h8prw\") pod \"root-account-create-update-gshbr\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:50.926381 master-0 kubenswrapper[16352]: I0307 21:40:50.926274 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-operator-scripts\") pod \"root-account-create-update-gshbr\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:51.028828 master-0 kubenswrapper[16352]: I0307 21:40:51.028650 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-operator-scripts\") pod \"root-account-create-update-gshbr\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:51.029253 master-0 kubenswrapper[16352]: I0307 21:40:51.029080 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8prw\" (UniqueName: \"kubernetes.io/projected/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-kube-api-access-h8prw\") pod \"root-account-create-update-gshbr\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:51.029837 master-0 kubenswrapper[16352]: I0307 21:40:51.029782 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-operator-scripts\") pod \"root-account-create-update-gshbr\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:51.048540 master-0 kubenswrapper[16352]: I0307 21:40:51.047890 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8prw\" (UniqueName: \"kubernetes.io/projected/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-kube-api-access-h8prw\") pod \"root-account-create-update-gshbr\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:51.196891 master-0 kubenswrapper[16352]: I0307 21:40:51.196007 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gshbr"
Mar 07 21:40:51.350253 master-0 kubenswrapper[16352]: I0307 21:40:51.350150 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-gr69d"]
Mar 07 21:40:51.360020 master-0 kubenswrapper[16352]: I0307 21:40:51.359946 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gr69d"
Mar 07 21:40:51.367474 master-0 kubenswrapper[16352]: I0307 21:40:51.363512 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 07 21:40:51.367474 master-0 kubenswrapper[16352]: I0307 21:40:51.363905 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data"
Mar 07 21:40:51.367474 master-0 kubenswrapper[16352]: I0307 21:40:51.364072 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts"
Mar 07 21:40:51.424547 master-0 kubenswrapper[16352]: I0307 21:40:51.424438 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gr69d"]
Mar 07 21:40:51.425781 master-0 kubenswrapper[16352]: E0307 21:40:51.425741 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-q6zf9 ring-data-devices scripts swiftconf], unattached volumes=[], failed to process volumes=[combined-ca-bundle dispersionconf etc-swift kube-api-access-q6zf9 ring-data-devices scripts swiftconf]: context canceled" pod="openstack/swift-ring-rebalance-gr69d" podUID="7f29294c-3986-4f30-bcce-cc91d2f37b7a"
Mar 07 21:40:51.443148 master-0 kubenswrapper[16352]: I0307 21:40:51.443000 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-796n4"]
Mar 07 21:40:51.447182 master-0 kubenswrapper[16352]: I0307 21:40:51.445223 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-796n4"
Mar 07 21:40:51.488838 master-0 kubenswrapper[16352]: I0307 21:40:51.488742 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-796n4"]
Mar 07 21:40:51.490674 master-0 kubenswrapper[16352]: I0307 21:40:51.488972 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gr69d"]
Mar 07 21:40:51.544090 master-0 kubenswrapper[16352]: I0307 21:40:51.544005 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f29294c-3986-4f30-bcce-cc91d2f37b7a-etc-swift\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d"
Mar 07 21:40:51.544487 master-0 kubenswrapper[16352]: I0307 21:40:51.544110 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-swiftconf\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d"
Mar 07 21:40:51.544487 master-0 kubenswrapper[16352]: I0307 21:40:51.544344 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-dispersionconf\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4"
Mar 07 21:40:51.544487 master-0 kubenswrapper[16352]: I0307 21:40:51.544397 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-combined-ca-bundle\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4"
Mar 07 21:40:51.544487 master-0 kubenswrapper[16352]: I0307 21:40:51.544428 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6zf9\" (UniqueName: \"kubernetes.io/projected/7f29294c-3986-4f30-bcce-cc91d2f37b7a-kube-api-access-q6zf9\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d"
Mar 07 21:40:51.544487 master-0 kubenswrapper[16352]: I0307 21:40:51.544456 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-combined-ca-bundle\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d"
Mar 07 21:40:51.544842 master-0 kubenswrapper[16352]: I0307 21:40:51.544510 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e74c1a05-5f07-4375-99f3-8f0db281543b-etc-swift\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4"
Mar 07 21:40:51.544842 master-0 kubenswrapper[16352]: I0307 21:40:51.544644 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-scripts\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d"
Mar 07 21:40:51.544842 master-0 kubenswrapper[16352]: I0307 21:40:51.544662 16352 reconciler_common.go:245]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r2vt\" (UniqueName: \"kubernetes.io/projected/e74c1a05-5f07-4375-99f3-8f0db281543b-kube-api-access-9r2vt\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.544842 master-0 kubenswrapper[16352]: I0307 21:40:51.544703 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-ring-data-devices\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.544842 master-0 kubenswrapper[16352]: I0307 21:40:51.544731 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-ring-data-devices\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.544842 master-0 kubenswrapper[16352]: I0307 21:40:51.544781 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-dispersionconf\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.547630 master-0 kubenswrapper[16352]: I0307 21:40:51.547602 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-scripts\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 
21:40:51.547705 master-0 kubenswrapper[16352]: I0307 21:40:51.547640 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-swiftconf\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.570101 master-0 kubenswrapper[16352]: I0307 21:40:51.570005 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.582866 master-0 kubenswrapper[16352]: I0307 21:40:51.582782 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.658368 master-0 kubenswrapper[16352]: I0307 21:40:51.658261 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9r2vt\" (UniqueName: \"kubernetes.io/projected/e74c1a05-5f07-4375-99f3-8f0db281543b-kube-api-access-9r2vt\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.658368 master-0 kubenswrapper[16352]: I0307 21:40:51.658346 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-scripts\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.659273 master-0 kubenswrapper[16352]: I0307 21:40:51.658796 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-ring-data-devices\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 
21:40:51.659273 master-0 kubenswrapper[16352]: I0307 21:40:51.658937 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-ring-data-devices\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.659273 master-0 kubenswrapper[16352]: I0307 21:40:51.659085 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-dispersionconf\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.659273 master-0 kubenswrapper[16352]: I0307 21:40:51.659118 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-swiftconf\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.659273 master-0 kubenswrapper[16352]: I0307 21:40:51.659152 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-scripts\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.659273 master-0 kubenswrapper[16352]: I0307 21:40:51.659249 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f29294c-3986-4f30-bcce-cc91d2f37b7a-etc-swift\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.659750 master-0 
kubenswrapper[16352]: I0307 21:40:51.659384 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-swiftconf\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.659750 master-0 kubenswrapper[16352]: I0307 21:40:51.659424 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-dispersionconf\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.659750 master-0 kubenswrapper[16352]: I0307 21:40:51.659457 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-combined-ca-bundle\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.659750 master-0 kubenswrapper[16352]: I0307 21:40:51.659496 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6zf9\" (UniqueName: \"kubernetes.io/projected/7f29294c-3986-4f30-bcce-cc91d2f37b7a-kube-api-access-q6zf9\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.659750 master-0 kubenswrapper[16352]: I0307 21:40:51.659548 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-combined-ca-bundle\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.659750 
master-0 kubenswrapper[16352]: I0307 21:40:51.659673 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e74c1a05-5f07-4375-99f3-8f0db281543b-etc-swift\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.660607 master-0 kubenswrapper[16352]: I0307 21:40:51.660555 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e74c1a05-5f07-4375-99f3-8f0db281543b-etc-swift\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.661790 master-0 kubenswrapper[16352]: I0307 21:40:51.660783 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-scripts\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.661790 master-0 kubenswrapper[16352]: I0307 21:40:51.660818 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-ring-data-devices\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.661790 master-0 kubenswrapper[16352]: I0307 21:40:51.661040 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f29294c-3986-4f30-bcce-cc91d2f37b7a-etc-swift\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.661790 master-0 kubenswrapper[16352]: I0307 21:40:51.661373 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-ring-data-devices\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.662255 master-0 kubenswrapper[16352]: I0307 21:40:51.661839 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-scripts\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.666108 master-0 kubenswrapper[16352]: I0307 21:40:51.666066 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-swiftconf\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.666384 master-0 kubenswrapper[16352]: I0307 21:40:51.666330 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-combined-ca-bundle\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.667130 master-0 kubenswrapper[16352]: I0307 21:40:51.667066 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-combined-ca-bundle\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.667333 master-0 kubenswrapper[16352]: I0307 21:40:51.667281 16352 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-swiftconf\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.669985 master-0 kubenswrapper[16352]: I0307 21:40:51.669935 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-dispersionconf\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.671125 master-0 kubenswrapper[16352]: I0307 21:40:51.671081 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-dispersionconf\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.677549 master-0 kubenswrapper[16352]: I0307 21:40:51.677509 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6zf9\" (UniqueName: \"kubernetes.io/projected/7f29294c-3986-4f30-bcce-cc91d2f37b7a-kube-api-access-q6zf9\") pod \"swift-ring-rebalance-gr69d\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:51.680125 master-0 kubenswrapper[16352]: I0307 21:40:51.680079 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r2vt\" (UniqueName: \"kubernetes.io/projected/e74c1a05-5f07-4375-99f3-8f0db281543b-kube-api-access-9r2vt\") pod \"swift-ring-rebalance-796n4\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.761412 master-0 kubenswrapper[16352]: I0307 21:40:51.761266 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-q6zf9\" (UniqueName: \"kubernetes.io/projected/7f29294c-3986-4f30-bcce-cc91d2f37b7a-kube-api-access-q6zf9\") pod \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " Mar 07 21:40:51.761641 master-0 kubenswrapper[16352]: I0307 21:40:51.761466 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f29294c-3986-4f30-bcce-cc91d2f37b7a-etc-swift\") pod \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " Mar 07 21:40:51.761641 master-0 kubenswrapper[16352]: I0307 21:40:51.761552 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-combined-ca-bundle\") pod \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " Mar 07 21:40:51.761641 master-0 kubenswrapper[16352]: I0307 21:40:51.761576 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-scripts\") pod \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " Mar 07 21:40:51.761641 master-0 kubenswrapper[16352]: I0307 21:40:51.761615 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-swiftconf\") pod \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " Mar 07 21:40:51.761798 master-0 kubenswrapper[16352]: I0307 21:40:51.761732 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-ring-data-devices\") pod \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\" (UID: 
\"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " Mar 07 21:40:51.761798 master-0 kubenswrapper[16352]: I0307 21:40:51.761761 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-dispersionconf\") pod \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\" (UID: \"7f29294c-3986-4f30-bcce-cc91d2f37b7a\") " Mar 07 21:40:51.762874 master-0 kubenswrapper[16352]: I0307 21:40:51.762797 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7f29294c-3986-4f30-bcce-cc91d2f37b7a-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "7f29294c-3986-4f30-bcce-cc91d2f37b7a" (UID: "7f29294c-3986-4f30-bcce-cc91d2f37b7a"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:40:51.763050 master-0 kubenswrapper[16352]: I0307 21:40:51.762985 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-scripts" (OuterVolumeSpecName: "scripts") pod "7f29294c-3986-4f30-bcce-cc91d2f37b7a" (UID: "7f29294c-3986-4f30-bcce-cc91d2f37b7a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:51.763344 master-0 kubenswrapper[16352]: I0307 21:40:51.763311 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "7f29294c-3986-4f30-bcce-cc91d2f37b7a" (UID: "7f29294c-3986-4f30-bcce-cc91d2f37b7a"). InnerVolumeSpecName "ring-data-devices". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:51.764640 master-0 kubenswrapper[16352]: I0307 21:40:51.764591 16352 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/7f29294c-3986-4f30-bcce-cc91d2f37b7a-etc-swift\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:51.764640 master-0 kubenswrapper[16352]: I0307 21:40:51.764633 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:51.764842 master-0 kubenswrapper[16352]: I0307 21:40:51.764646 16352 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/7f29294c-3986-4f30-bcce-cc91d2f37b7a-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:51.767233 master-0 kubenswrapper[16352]: I0307 21:40:51.767193 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "7f29294c-3986-4f30-bcce-cc91d2f37b7a" (UID: "7f29294c-3986-4f30-bcce-cc91d2f37b7a"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:40:51.767871 master-0 kubenswrapper[16352]: I0307 21:40:51.767826 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "7f29294c-3986-4f30-bcce-cc91d2f37b7a" (UID: "7f29294c-3986-4f30-bcce-cc91d2f37b7a"). InnerVolumeSpecName "swiftconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:40:51.768289 master-0 kubenswrapper[16352]: I0307 21:40:51.768225 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f29294c-3986-4f30-bcce-cc91d2f37b7a-kube-api-access-q6zf9" (OuterVolumeSpecName: "kube-api-access-q6zf9") pod "7f29294c-3986-4f30-bcce-cc91d2f37b7a" (UID: "7f29294c-3986-4f30-bcce-cc91d2f37b7a"). InnerVolumeSpecName "kube-api-access-q6zf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:40:51.768360 master-0 kubenswrapper[16352]: I0307 21:40:51.768316 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7f29294c-3986-4f30-bcce-cc91d2f37b7a" (UID: "7f29294c-3986-4f30-bcce-cc91d2f37b7a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:40:51.813849 master-0 kubenswrapper[16352]: I0307 21:40:51.813764 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:40:51.839772 master-0 kubenswrapper[16352]: I0307 21:40:51.839671 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-gshbr"] Mar 07 21:40:51.849276 master-0 kubenswrapper[16352]: W0307 21:40:51.849194 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5464b2e2_8bd4_43e0_815a_9a1eeeb2b9a6.slice/crio-d18f215816be4fefe5310ddd3804759597d28cb9312ecbb86912eb922b0a3b44 WatchSource:0}: Error finding container d18f215816be4fefe5310ddd3804759597d28cb9312ecbb86912eb922b0a3b44: Status 404 returned error can't find the container with id d18f215816be4fefe5310ddd3804759597d28cb9312ecbb86912eb922b0a3b44 Mar 07 21:40:51.869070 master-0 kubenswrapper[16352]: I0307 21:40:51.868996 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:51.869070 master-0 kubenswrapper[16352]: I0307 21:40:51.869056 16352 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-swiftconf\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:51.869070 master-0 kubenswrapper[16352]: I0307 21:40:51.869066 16352 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/7f29294c-3986-4f30-bcce-cc91d2f37b7a-dispersionconf\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:51.869070 master-0 kubenswrapper[16352]: I0307 21:40:51.869080 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q6zf9\" (UniqueName: \"kubernetes.io/projected/7f29294c-3986-4f30-bcce-cc91d2f37b7a-kube-api-access-q6zf9\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:52.278056 master-0 
kubenswrapper[16352]: I0307 21:40:52.277866 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0" Mar 07 21:40:52.278327 master-0 kubenswrapper[16352]: E0307 21:40:52.278132 16352 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 07 21:40:52.278327 master-0 kubenswrapper[16352]: E0307 21:40:52.278182 16352 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 07 21:40:52.278422 master-0 kubenswrapper[16352]: E0307 21:40:52.278360 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift podName:c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62 nodeName:}" failed. No retries permitted until 2026-03-07 21:40:56.27833174 +0000 UTC m=+1379.349036799 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift") pod "swift-storage-0" (UID: "c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62") : configmap "swift-ring-files" not found Mar 07 21:40:52.345091 master-0 kubenswrapper[16352]: I0307 21:40:52.345018 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-796n4"] Mar 07 21:40:52.589143 master-0 kubenswrapper[16352]: I0307 21:40:52.588788 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-796n4" event={"ID":"e74c1a05-5f07-4375-99f3-8f0db281543b","Type":"ContainerStarted","Data":"1b521828b4f8a411551598d711446fb74f76411b7f8e82d58e1125beecaf61bd"} Mar 07 21:40:52.593360 master-0 kubenswrapper[16352]: I0307 21:40:52.593268 16352 generic.go:334] "Generic (PLEG): container finished" podID="5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6" containerID="a23d62c841cede038c0d7eaae41e12ad84dcebabbcd19790cc9a3deb18c0b09d" exitCode=0 Mar 07 21:40:52.593500 master-0 kubenswrapper[16352]: I0307 21:40:52.593372 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-gr69d" Mar 07 21:40:52.593500 master-0 kubenswrapper[16352]: I0307 21:40:52.593403 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gshbr" event={"ID":"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6","Type":"ContainerDied","Data":"a23d62c841cede038c0d7eaae41e12ad84dcebabbcd19790cc9a3deb18c0b09d"} Mar 07 21:40:52.593500 master-0 kubenswrapper[16352]: I0307 21:40:52.593502 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-gshbr" event={"ID":"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6","Type":"ContainerStarted","Data":"d18f215816be4fefe5310ddd3804759597d28cb9312ecbb86912eb922b0a3b44"} Mar 07 21:40:52.751072 master-0 kubenswrapper[16352]: I0307 21:40:52.750957 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/swift-ring-rebalance-gr69d"] Mar 07 21:40:52.767963 master-0 kubenswrapper[16352]: I0307 21:40:52.767855 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/swift-ring-rebalance-gr69d"] Mar 07 21:40:53.215887 master-0 kubenswrapper[16352]: I0307 21:40:53.213371 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f29294c-3986-4f30-bcce-cc91d2f37b7a" path="/var/lib/kubelet/pods/7f29294c-3986-4f30-bcce-cc91d2f37b7a/volumes" Mar 07 21:40:54.434144 master-0 kubenswrapper[16352]: I0307 21:40:54.434033 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-8jd9b"] Mar 07 21:40:54.437919 master-0 kubenswrapper[16352]: I0307 21:40:54.437869 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.470325 master-0 kubenswrapper[16352]: I0307 21:40:54.468503 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8jd9b"] Mar 07 21:40:54.508988 master-0 kubenswrapper[16352]: I0307 21:40:54.508890 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-3631-account-create-update-8m8jf"] Mar 07 21:40:54.516555 master-0 kubenswrapper[16352]: I0307 21:40:54.516475 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:54.522081 master-0 kubenswrapper[16352]: I0307 21:40:54.521994 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 07 21:40:54.527586 master-0 kubenswrapper[16352]: I0307 21:40:54.527518 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3631-account-create-update-8m8jf"] Mar 07 21:40:54.581235 master-0 kubenswrapper[16352]: I0307 21:40:54.581127 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdhmw\" (UniqueName: \"kubernetes.io/projected/5585c07c-5a03-46df-a14b-0cd2550e7ca1-kube-api-access-rdhmw\") pod \"glance-db-create-8jd9b\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") " pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.581986 master-0 kubenswrapper[16352]: I0307 21:40:54.581921 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5585c07c-5a03-46df-a14b-0cd2550e7ca1-operator-scripts\") pod \"glance-db-create-8jd9b\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") " pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.619577 master-0 kubenswrapper[16352]: I0307 21:40:54.619500 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/root-account-create-update-gshbr" event={"ID":"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6","Type":"ContainerDied","Data":"d18f215816be4fefe5310ddd3804759597d28cb9312ecbb86912eb922b0a3b44"} Mar 07 21:40:54.619577 master-0 kubenswrapper[16352]: I0307 21:40:54.619570 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d18f215816be4fefe5310ddd3804759597d28cb9312ecbb86912eb922b0a3b44" Mar 07 21:40:54.659387 master-0 kubenswrapper[16352]: I0307 21:40:54.659324 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-gshbr" Mar 07 21:40:54.687098 master-0 kubenswrapper[16352]: I0307 21:40:54.686934 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5585c07c-5a03-46df-a14b-0cd2550e7ca1-operator-scripts\") pod \"glance-db-create-8jd9b\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") " pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.687098 master-0 kubenswrapper[16352]: I0307 21:40:54.687051 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842cfecc-7700-4a61-a685-c211db763dcb-operator-scripts\") pod \"glance-3631-account-create-update-8m8jf\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") " pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:54.687438 master-0 kubenswrapper[16352]: I0307 21:40:54.687105 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfnts\" (UniqueName: \"kubernetes.io/projected/842cfecc-7700-4a61-a685-c211db763dcb-kube-api-access-qfnts\") pod \"glance-3631-account-create-update-8m8jf\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") " pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:54.687438 master-0 kubenswrapper[16352]: I0307 
21:40:54.687241 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdhmw\" (UniqueName: \"kubernetes.io/projected/5585c07c-5a03-46df-a14b-0cd2550e7ca1-kube-api-access-rdhmw\") pod \"glance-db-create-8jd9b\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") " pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.689424 master-0 kubenswrapper[16352]: I0307 21:40:54.689329 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5585c07c-5a03-46df-a14b-0cd2550e7ca1-operator-scripts\") pod \"glance-db-create-8jd9b\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") " pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.715671 master-0 kubenswrapper[16352]: I0307 21:40:54.715605 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdhmw\" (UniqueName: \"kubernetes.io/projected/5585c07c-5a03-46df-a14b-0cd2550e7ca1-kube-api-access-rdhmw\") pod \"glance-db-create-8jd9b\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") " pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.776066 master-0 kubenswrapper[16352]: I0307 21:40:54.775981 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-create-8jd9b" Mar 07 21:40:54.789365 master-0 kubenswrapper[16352]: I0307 21:40:54.789307 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8prw\" (UniqueName: \"kubernetes.io/projected/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-kube-api-access-h8prw\") pod \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " Mar 07 21:40:54.790518 master-0 kubenswrapper[16352]: I0307 21:40:54.790417 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-operator-scripts\") pod \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\" (UID: \"5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6\") " Mar 07 21:40:54.791540 master-0 kubenswrapper[16352]: I0307 21:40:54.791446 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6" (UID: "5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:54.793880 master-0 kubenswrapper[16352]: I0307 21:40:54.793838 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842cfecc-7700-4a61-a685-c211db763dcb-operator-scripts\") pod \"glance-3631-account-create-update-8m8jf\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") " pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:54.794773 master-0 kubenswrapper[16352]: I0307 21:40:54.794721 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfnts\" (UniqueName: \"kubernetes.io/projected/842cfecc-7700-4a61-a685-c211db763dcb-kube-api-access-qfnts\") pod \"glance-3631-account-create-update-8m8jf\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") " pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:54.794985 master-0 kubenswrapper[16352]: I0307 21:40:54.794944 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-kube-api-access-h8prw" (OuterVolumeSpecName: "kube-api-access-h8prw") pod "5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6" (UID: "5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6"). InnerVolumeSpecName "kube-api-access-h8prw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:40:54.795057 master-0 kubenswrapper[16352]: I0307 21:40:54.795003 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842cfecc-7700-4a61-a685-c211db763dcb-operator-scripts\") pod \"glance-3631-account-create-update-8m8jf\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") " pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:54.795650 master-0 kubenswrapper[16352]: I0307 21:40:54.795530 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8prw\" (UniqueName: \"kubernetes.io/projected/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-kube-api-access-h8prw\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:54.795650 master-0 kubenswrapper[16352]: I0307 21:40:54.795622 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:54.819019 master-0 kubenswrapper[16352]: I0307 21:40:54.818946 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qfnts\" (UniqueName: \"kubernetes.io/projected/842cfecc-7700-4a61-a685-c211db763dcb-kube-api-access-qfnts\") pod \"glance-3631-account-create-update-8m8jf\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") " pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:54.836326 master-0 kubenswrapper[16352]: I0307 21:40:54.836270 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-3631-account-create-update-8m8jf" Mar 07 21:40:55.101374 master-0 kubenswrapper[16352]: I0307 21:40:55.101287 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-q8tkd"] Mar 07 21:40:55.102291 master-0 kubenswrapper[16352]: E0307 21:40:55.102244 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6" containerName="mariadb-account-create-update" Mar 07 21:40:55.102291 master-0 kubenswrapper[16352]: I0307 21:40:55.102282 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6" containerName="mariadb-account-create-update" Mar 07 21:40:55.102752 master-0 kubenswrapper[16352]: I0307 21:40:55.102660 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6" containerName="mariadb-account-create-update" Mar 07 21:40:55.104088 master-0 kubenswrapper[16352]: I0307 21:40:55.104041 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.108756 master-0 kubenswrapper[16352]: I0307 21:40:55.108619 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2cqk7\" (UniqueName: \"kubernetes.io/projected/732d7db0-01b4-4187-92d9-01e6a04f91f8-kube-api-access-2cqk7\") pod \"keystone-db-create-q8tkd\" (UID: \"732d7db0-01b4-4187-92d9-01e6a04f91f8\") " pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.108880 master-0 kubenswrapper[16352]: I0307 21:40:55.108828 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732d7db0-01b4-4187-92d9-01e6a04f91f8-operator-scripts\") pod \"keystone-db-create-q8tkd\" (UID: \"732d7db0-01b4-4187-92d9-01e6a04f91f8\") " pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.127459 master-0 kubenswrapper[16352]: I0307 21:40:55.113223 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8tkd"] Mar 07 21:40:55.180899 master-0 kubenswrapper[16352]: I0307 21:40:55.180790 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-c490-account-create-update-rc6gq"] Mar 07 21:40:55.183497 master-0 kubenswrapper[16352]: I0307 21:40:55.183416 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.186973 master-0 kubenswrapper[16352]: I0307 21:40:55.186928 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 07 21:40:55.219112 master-0 kubenswrapper[16352]: I0307 21:40:55.216271 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2cqk7\" (UniqueName: \"kubernetes.io/projected/732d7db0-01b4-4187-92d9-01e6a04f91f8-kube-api-access-2cqk7\") pod \"keystone-db-create-q8tkd\" (UID: \"732d7db0-01b4-4187-92d9-01e6a04f91f8\") " pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.219112 master-0 kubenswrapper[16352]: I0307 21:40:55.216492 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732d7db0-01b4-4187-92d9-01e6a04f91f8-operator-scripts\") pod \"keystone-db-create-q8tkd\" (UID: \"732d7db0-01b4-4187-92d9-01e6a04f91f8\") " pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.219112 master-0 kubenswrapper[16352]: I0307 21:40:55.217633 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c490-account-create-update-rc6gq"] Mar 07 21:40:55.222110 master-0 kubenswrapper[16352]: I0307 21:40:55.219626 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732d7db0-01b4-4187-92d9-01e6a04f91f8-operator-scripts\") pod \"keystone-db-create-q8tkd\" (UID: \"732d7db0-01b4-4187-92d9-01e6a04f91f8\") " pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.251794 master-0 kubenswrapper[16352]: I0307 21:40:55.251718 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2cqk7\" (UniqueName: \"kubernetes.io/projected/732d7db0-01b4-4187-92d9-01e6a04f91f8-kube-api-access-2cqk7\") pod \"keystone-db-create-q8tkd\" (UID: 
\"732d7db0-01b4-4187-92d9-01e6a04f91f8\") " pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.302111 master-0 kubenswrapper[16352]: I0307 21:40:55.301975 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-shqgh"] Mar 07 21:40:55.323068 master-0 kubenswrapper[16352]: I0307 21:40:55.320343 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.339586 master-0 kubenswrapper[16352]: I0307 21:40:55.339206 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-shqgh"] Mar 07 21:40:55.363856 master-0 kubenswrapper[16352]: I0307 21:40:55.353732 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b6l9l\" (UniqueName: \"kubernetes.io/projected/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-kube-api-access-b6l9l\") pod \"placement-db-create-shqgh\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") " pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.363856 master-0 kubenswrapper[16352]: I0307 21:40:55.354289 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wcvd\" (UniqueName: \"kubernetes.io/projected/7b1e7528-a696-43e8-b487-d981fb460467-kube-api-access-5wcvd\") pod \"keystone-c490-account-create-update-rc6gq\" (UID: \"7b1e7528-a696-43e8-b487-d981fb460467\") " pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.363856 master-0 kubenswrapper[16352]: I0307 21:40:55.354470 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1e7528-a696-43e8-b487-d981fb460467-operator-scripts\") pod \"keystone-c490-account-create-update-rc6gq\" (UID: \"7b1e7528-a696-43e8-b487-d981fb460467\") " pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.363856 
master-0 kubenswrapper[16352]: I0307 21:40:55.354517 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-operator-scripts\") pod \"placement-db-create-shqgh\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") " pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.406807 master-0 kubenswrapper[16352]: I0307 21:40:55.406728 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-326e-account-create-update-g8dbq"] Mar 07 21:40:55.410309 master-0 kubenswrapper[16352]: I0307 21:40:55.410174 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:55.414110 master-0 kubenswrapper[16352]: I0307 21:40:55.414053 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 07 21:40:55.433559 master-0 kubenswrapper[16352]: I0307 21:40:55.433459 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-326e-account-create-update-g8dbq"] Mar 07 21:40:55.457276 master-0 kubenswrapper[16352]: I0307 21:40:55.457158 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-operator-scripts\") pod \"placement-326e-account-create-update-g8dbq\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") " pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:55.458935 master-0 kubenswrapper[16352]: I0307 21:40:55.457564 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wcvd\" (UniqueName: \"kubernetes.io/projected/7b1e7528-a696-43e8-b487-d981fb460467-kube-api-access-5wcvd\") pod \"keystone-c490-account-create-update-rc6gq\" (UID: 
\"7b1e7528-a696-43e8-b487-d981fb460467\") " pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.458935 master-0 kubenswrapper[16352]: I0307 21:40:55.457823 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7px9h\" (UniqueName: \"kubernetes.io/projected/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-kube-api-access-7px9h\") pod \"placement-326e-account-create-update-g8dbq\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") " pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:55.458935 master-0 kubenswrapper[16352]: I0307 21:40:55.457899 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1e7528-a696-43e8-b487-d981fb460467-operator-scripts\") pod \"keystone-c490-account-create-update-rc6gq\" (UID: \"7b1e7528-a696-43e8-b487-d981fb460467\") " pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.458935 master-0 kubenswrapper[16352]: I0307 21:40:55.457952 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-operator-scripts\") pod \"placement-db-create-shqgh\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") " pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.458935 master-0 kubenswrapper[16352]: I0307 21:40:55.458759 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b6l9l\" (UniqueName: \"kubernetes.io/projected/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-kube-api-access-b6l9l\") pod \"placement-db-create-shqgh\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") " pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.459095 master-0 kubenswrapper[16352]: I0307 21:40:55.458935 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7b1e7528-a696-43e8-b487-d981fb460467-operator-scripts\") pod \"keystone-c490-account-create-update-rc6gq\" (UID: \"7b1e7528-a696-43e8-b487-d981fb460467\") " pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.459414 master-0 kubenswrapper[16352]: I0307 21:40:55.459370 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8tkd" Mar 07 21:40:55.464800 master-0 kubenswrapper[16352]: I0307 21:40:55.462207 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-operator-scripts\") pod \"placement-db-create-shqgh\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") " pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.479173 master-0 kubenswrapper[16352]: I0307 21:40:55.479107 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wcvd\" (UniqueName: \"kubernetes.io/projected/7b1e7528-a696-43e8-b487-d981fb460467-kube-api-access-5wcvd\") pod \"keystone-c490-account-create-update-rc6gq\" (UID: \"7b1e7528-a696-43e8-b487-d981fb460467\") " pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.488167 master-0 kubenswrapper[16352]: I0307 21:40:55.486458 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6l9l\" (UniqueName: \"kubernetes.io/projected/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-kube-api-access-b6l9l\") pod \"placement-db-create-shqgh\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") " pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.519051 master-0 kubenswrapper[16352]: I0307 21:40:55.518980 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-c490-account-create-update-rc6gq" Mar 07 21:40:55.562169 master-0 kubenswrapper[16352]: I0307 21:40:55.562034 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7px9h\" (UniqueName: \"kubernetes.io/projected/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-kube-api-access-7px9h\") pod \"placement-326e-account-create-update-g8dbq\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") " pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:55.562559 master-0 kubenswrapper[16352]: I0307 21:40:55.562235 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-operator-scripts\") pod \"placement-326e-account-create-update-g8dbq\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") " pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:55.563249 master-0 kubenswrapper[16352]: I0307 21:40:55.563212 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-operator-scripts\") pod \"placement-326e-account-create-update-g8dbq\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") " pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:55.582948 master-0 kubenswrapper[16352]: I0307 21:40:55.582868 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7px9h\" (UniqueName: \"kubernetes.io/projected/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-kube-api-access-7px9h\") pod \"placement-326e-account-create-update-g8dbq\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") " pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:55.654818 master-0 kubenswrapper[16352]: I0307 21:40:55.653930 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-gshbr" Mar 07 21:40:55.687934 master-0 kubenswrapper[16352]: I0307 21:40:55.686197 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-shqgh" Mar 07 21:40:55.764958 master-0 kubenswrapper[16352]: I0307 21:40:55.764865 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-326e-account-create-update-g8dbq" Mar 07 21:40:56.285140 master-0 kubenswrapper[16352]: I0307 21:40:56.285064 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0" Mar 07 21:40:56.285455 master-0 kubenswrapper[16352]: E0307 21:40:56.285418 16352 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 07 21:40:56.285455 master-0 kubenswrapper[16352]: E0307 21:40:56.285435 16352 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 07 21:40:56.285635 master-0 kubenswrapper[16352]: E0307 21:40:56.285477 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift podName:c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62 nodeName:}" failed. No retries permitted until 2026-03-07 21:41:04.285461688 +0000 UTC m=+1387.356166747 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift") pod "swift-storage-0" (UID: "c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62") : configmap "swift-ring-files" not found Mar 07 21:40:56.621113 master-0 kubenswrapper[16352]: I0307 21:40:56.620867 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" Mar 07 21:40:56.990776 master-0 kubenswrapper[16352]: I0307 21:40:56.983467 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-pt84w"] Mar 07 21:40:56.990776 master-0 kubenswrapper[16352]: I0307 21:40:56.983795 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" podUID="167747ff-9ac4-42be-9894-549a51404415" containerName="dnsmasq-dns" containerID="cri-o://8d8f4443df157eeb09f76d18567cf29044279a3cac8b9cf89f2f0cdabcf68049" gracePeriod=10 Mar 07 21:40:57.255523 master-0 kubenswrapper[16352]: I0307 21:40:57.255384 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-gshbr"] Mar 07 21:40:57.279572 master-0 kubenswrapper[16352]: I0307 21:40:57.279457 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-gshbr"] Mar 07 21:40:57.549243 master-0 kubenswrapper[16352]: I0307 21:40:57.549165 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-8jd9b"] Mar 07 21:40:57.556314 master-0 kubenswrapper[16352]: I0307 21:40:57.555497 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-c490-account-create-update-rc6gq"] Mar 07 21:40:57.582692 master-0 kubenswrapper[16352]: I0307 21:40:57.582623 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 07 21:40:57.691152 master-0 kubenswrapper[16352]: I0307 21:40:57.691040 16352 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/keystone-c490-account-create-update-rc6gq" event={"ID":"7b1e7528-a696-43e8-b487-d981fb460467","Type":"ContainerStarted","Data":"350aef14ef463f18b5d495616832ae0176314ab9386071aca57c3b39bd748a01"} Mar 07 21:40:57.701791 master-0 kubenswrapper[16352]: I0307 21:40:57.694611 16352 generic.go:334] "Generic (PLEG): container finished" podID="167747ff-9ac4-42be-9894-549a51404415" containerID="8d8f4443df157eeb09f76d18567cf29044279a3cac8b9cf89f2f0cdabcf68049" exitCode=0 Mar 07 21:40:57.701791 master-0 kubenswrapper[16352]: I0307 21:40:57.694725 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" event={"ID":"167747ff-9ac4-42be-9894-549a51404415","Type":"ContainerDied","Data":"8d8f4443df157eeb09f76d18567cf29044279a3cac8b9cf89f2f0cdabcf68049"} Mar 07 21:40:57.701791 master-0 kubenswrapper[16352]: I0307 21:40:57.696346 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8jd9b" event={"ID":"5585c07c-5a03-46df-a14b-0cd2550e7ca1","Type":"ContainerStarted","Data":"d7e0d620714211e3b1ccdb93fea78d7f078e653786a1dda900a18e779558e3cc"} Mar 07 21:40:57.701791 master-0 kubenswrapper[16352]: I0307 21:40:57.699805 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-796n4" event={"ID":"e74c1a05-5f07-4375-99f3-8f0db281543b","Type":"ContainerStarted","Data":"51b420456f8034f930715129b10530859cb6a93859a69b9be2aacbae1a0a2dc7"} Mar 07 21:40:57.738755 master-0 kubenswrapper[16352]: I0307 21:40:57.738589 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-796n4" podStartSLOduration=2.597563988 podStartE2EDuration="6.738557018s" podCreationTimestamp="2026-03-07 21:40:51 +0000 UTC" firstStartedPulling="2026-03-07 21:40:52.35248596 +0000 UTC m=+1375.423191019" lastFinishedPulling="2026-03-07 21:40:56.49347899 +0000 UTC m=+1379.564184049" observedRunningTime="2026-03-07 
21:40:57.73117091 +0000 UTC m=+1380.801875979" watchObservedRunningTime="2026-03-07 21:40:57.738557018 +0000 UTC m=+1380.809262097" Mar 07 21:40:57.868588 master-0 kubenswrapper[16352]: I0307 21:40:57.867636 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 07 21:40:57.915383 master-0 kubenswrapper[16352]: I0307 21:40:57.913964 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-326e-account-create-update-g8dbq"] Mar 07 21:40:57.954804 master-0 kubenswrapper[16352]: I0307 21:40:57.952872 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-q8tkd"] Mar 07 21:40:57.958575 master-0 kubenswrapper[16352]: I0307 21:40:57.958523 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 07 21:40:57.967035 master-0 kubenswrapper[16352]: I0307 21:40:57.966990 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-shqgh"] Mar 07 21:40:57.980475 master-0 kubenswrapper[16352]: I0307 21:40:57.980397 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-3631-account-create-update-8m8jf"] Mar 07 21:40:58.199134 master-0 kubenswrapper[16352]: I0307 21:40:58.199064 16352 trace.go:236] Trace[1189909610]: "Calculate volume metrics of swift for pod openstack/swift-storage-0" (07-Mar-2026 21:40:57.180) (total time: 1018ms): Mar 07 21:40:58.199134 master-0 kubenswrapper[16352]: Trace[1189909610]: [1.01857459s] [1.01857459s] END Mar 07 21:40:58.201816 master-0 kubenswrapper[16352]: I0307 21:40:58.199581 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" Mar 07 21:40:58.301611 master-0 kubenswrapper[16352]: I0307 21:40:58.301334 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-sb\") pod \"167747ff-9ac4-42be-9894-549a51404415\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " Mar 07 21:40:58.301611 master-0 kubenswrapper[16352]: I0307 21:40:58.301442 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-dns-svc\") pod \"167747ff-9ac4-42be-9894-549a51404415\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " Mar 07 21:40:58.302146 master-0 kubenswrapper[16352]: I0307 21:40:58.301656 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-config\") pod \"167747ff-9ac4-42be-9894-549a51404415\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " Mar 07 21:40:58.302146 master-0 kubenswrapper[16352]: I0307 21:40:58.301711 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ck4sg\" (UniqueName: \"kubernetes.io/projected/167747ff-9ac4-42be-9894-549a51404415-kube-api-access-ck4sg\") pod \"167747ff-9ac4-42be-9894-549a51404415\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " Mar 07 21:40:58.302146 master-0 kubenswrapper[16352]: I0307 21:40:58.301830 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-nb\") pod \"167747ff-9ac4-42be-9894-549a51404415\" (UID: \"167747ff-9ac4-42be-9894-549a51404415\") " Mar 07 21:40:58.326820 master-0 kubenswrapper[16352]: I0307 21:40:58.326726 16352 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/167747ff-9ac4-42be-9894-549a51404415-kube-api-access-ck4sg" (OuterVolumeSpecName: "kube-api-access-ck4sg") pod "167747ff-9ac4-42be-9894-549a51404415" (UID: "167747ff-9ac4-42be-9894-549a51404415"). InnerVolumeSpecName "kube-api-access-ck4sg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:40:58.364721 master-0 kubenswrapper[16352]: I0307 21:40:58.362453 16352 trace.go:236] Trace[819461846]: "Calculate volume metrics of ovndbcluster-sb-etc-ovn for pod openstack/ovsdbserver-sb-0" (07-Mar-2026 21:40:57.182) (total time: 1179ms): Mar 07 21:40:58.364721 master-0 kubenswrapper[16352]: Trace[819461846]: [1.17981907s] [1.17981907s] END Mar 07 21:40:58.405908 master-0 kubenswrapper[16352]: I0307 21:40:58.405825 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ck4sg\" (UniqueName: \"kubernetes.io/projected/167747ff-9ac4-42be-9894-549a51404415-kube-api-access-ck4sg\") on node \"master-0\" DevicePath \"\"" Mar 07 21:40:58.442030 master-0 kubenswrapper[16352]: I0307 21:40:58.441935 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "167747ff-9ac4-42be-9894-549a51404415" (UID: "167747ff-9ac4-42be-9894-549a51404415"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:40:58.451777 master-0 kubenswrapper[16352]: I0307 21:40:58.450989 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "167747ff-9ac4-42be-9894-549a51404415" (UID: "167747ff-9ac4-42be-9894-549a51404415"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:40:58.462124 master-0 kubenswrapper[16352]: I0307 21:40:58.461946 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-config" (OuterVolumeSpecName: "config") pod "167747ff-9ac4-42be-9894-549a51404415" (UID: "167747ff-9ac4-42be-9894-549a51404415"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:40:58.474880 master-0 kubenswrapper[16352]: I0307 21:40:58.473146 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "167747ff-9ac4-42be-9894-549a51404415" (UID: "167747ff-9ac4-42be-9894-549a51404415"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:40:58.502936 master-0 kubenswrapper[16352]: I0307 21:40:58.502879 16352 trace.go:236] Trace[90832227]: "Calculate volume metrics of ovndbcluster-nb-etc-ovn for pod openstack/ovsdbserver-nb-0" (07-Mar-2026 21:40:57.178) (total time: 1324ms):
Mar 07 21:40:58.502936 master-0 kubenswrapper[16352]: Trace[90832227]: [1.324281818s] [1.324281818s] END
Mar 07 21:40:58.512517 master-0 kubenswrapper[16352]: I0307 21:40:58.512392 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:58.512517 master-0 kubenswrapper[16352]: I0307 21:40:58.512442 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:58.512517 master-0 kubenswrapper[16352]: I0307 21:40:58.512453 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:58.512517 master-0 kubenswrapper[16352]: I0307 21:40:58.512463 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/167747ff-9ac4-42be-9894-549a51404415-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:40:58.735422 master-0 kubenswrapper[16352]: I0307 21:40:58.731151 16352 generic.go:334] "Generic (PLEG): container finished" podID="732d7db0-01b4-4187-92d9-01e6a04f91f8" containerID="01516de94ecdfbaf37dc0c71f462dcf88bea038636c0475dffbc097dacbadf8c" exitCode=0
Mar 07 21:40:58.735422 master-0 kubenswrapper[16352]: I0307 21:40:58.731255 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8tkd" event={"ID":"732d7db0-01b4-4187-92d9-01e6a04f91f8","Type":"ContainerDied","Data":"01516de94ecdfbaf37dc0c71f462dcf88bea038636c0475dffbc097dacbadf8c"}
Mar 07 21:40:58.735422 master-0 kubenswrapper[16352]: I0307 21:40:58.731296 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8tkd" event={"ID":"732d7db0-01b4-4187-92d9-01e6a04f91f8","Type":"ContainerStarted","Data":"a0073b97ffad389b0581863ce12e3a49cf544d7fc2be66230ca596c99368f692"}
Mar 07 21:40:58.736213 master-0 kubenswrapper[16352]: I0307 21:40:58.735410 16352 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w"
Mar 07 21:40:58.736213 master-0 kubenswrapper[16352]: I0307 21:40:58.735673 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-58dc6c9559-pt84w" event={"ID":"167747ff-9ac4-42be-9894-549a51404415","Type":"ContainerDied","Data":"62c1d194d1d62073a66561eaabbdd97e0272bdad0ec866f0d1f06d74209fbae3"}
Mar 07 21:40:58.736213 master-0 kubenswrapper[16352]: I0307 21:40:58.735769 16352 scope.go:117] "RemoveContainer" containerID="8d8f4443df157eeb09f76d18567cf29044279a3cac8b9cf89f2f0cdabcf68049"
Mar 07 21:40:58.739949 master-0 kubenswrapper[16352]: I0307 21:40:58.739072 16352 generic.go:334] "Generic (PLEG): container finished" podID="5585c07c-5a03-46df-a14b-0cd2550e7ca1" containerID="3046e1b94ffdc03a870af1c7395289a410b6ce16ca34eeedf5880171425e18dd" exitCode=0
Mar 07 21:40:58.739949 master-0 kubenswrapper[16352]: I0307 21:40:58.739136 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8jd9b" event={"ID":"5585c07c-5a03-46df-a14b-0cd2550e7ca1","Type":"ContainerDied","Data":"3046e1b94ffdc03a870af1c7395289a410b6ce16ca34eeedf5880171425e18dd"}
Mar 07 21:40:58.745195 master-0 kubenswrapper[16352]: I0307 21:40:58.742899 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-shqgh" event={"ID":"16841f85-d2fc-46a7-8c09-c957cb2cfb4f","Type":"ContainerStarted","Data":"d05854d3fa0625cdd2ed64a244c92ef91acfa81fc8d83d589e8979a8f3bbb33b"}
Mar 07 21:40:58.745195 master-0 kubenswrapper[16352]: I0307 21:40:58.742928 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-shqgh" event={"ID":"16841f85-d2fc-46a7-8c09-c957cb2cfb4f","Type":"ContainerStarted","Data":"5d5b26c93936947f60e91151617eb59d18a11b0b21c77d50933a822d729d673a"}
Mar 07 21:40:58.745502 master-0 kubenswrapper[16352]: I0307 21:40:58.745409 16352 generic.go:334] "Generic (PLEG): container finished" podID="dc6fea59-4e62-414e-a8c0-d6c6a60fb72c" containerID="a1e576e932db5e2f3c69c6aad98cf8f3e15fcea20e6c94144b9af8de58570c1c" exitCode=0
Mar 07 21:40:58.745502 master-0 kubenswrapper[16352]: I0307 21:40:58.745448 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-326e-account-create-update-g8dbq" event={"ID":"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c","Type":"ContainerDied","Data":"a1e576e932db5e2f3c69c6aad98cf8f3e15fcea20e6c94144b9af8de58570c1c"}
Mar 07 21:40:58.745502 master-0 kubenswrapper[16352]: I0307 21:40:58.745466 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-326e-account-create-update-g8dbq" event={"ID":"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c","Type":"ContainerStarted","Data":"2a5766a923eff36357e615ca4b5984fdf3cc1734be73fa7ff0efe3c29e7cfee6"}
Mar 07 21:40:58.750546 master-0 kubenswrapper[16352]: I0307 21:40:58.750457 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3631-account-create-update-8m8jf" event={"ID":"842cfecc-7700-4a61-a685-c211db763dcb","Type":"ContainerStarted","Data":"a6e2a25b80eca6e549dfa1628abd2c09396150183fe670303541bdccbe0f9072"}
Mar 07 21:40:58.750546 master-0 kubenswrapper[16352]: I0307 21:40:58.750540 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3631-account-create-update-8m8jf" event={"ID":"842cfecc-7700-4a61-a685-c211db763dcb","Type":"ContainerStarted","Data":"ba4c73beb7ef08d0b9908a7edfbd650647bcd25a8c5a32c4c1c09b0df9ed7233"}
Mar 07 21:40:58.771773 master-0 kubenswrapper[16352]: I0307 21:40:58.770523 16352 generic.go:334] "Generic (PLEG): container finished" podID="7b1e7528-a696-43e8-b487-d981fb460467" containerID="e6ec93a78a1f7a3aa6d6df7415329166122121d250cbbce9fea6fadde9efeca3" exitCode=0
Mar 07 21:40:58.771773 master-0 kubenswrapper[16352]: I0307 21:40:58.770667 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c490-account-create-update-rc6gq"
event={"ID":"7b1e7528-a696-43e8-b487-d981fb460467","Type":"ContainerDied","Data":"e6ec93a78a1f7a3aa6d6df7415329166122121d250cbbce9fea6fadde9efeca3"}
Mar 07 21:40:58.814190 master-0 kubenswrapper[16352]: I0307 21:40:58.813961 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-3631-account-create-update-8m8jf" podStartSLOduration=4.813930711 podStartE2EDuration="4.813930711s" podCreationTimestamp="2026-03-07 21:40:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:40:58.802342923 +0000 UTC m=+1381.873048022" watchObservedRunningTime="2026-03-07 21:40:58.813930711 +0000 UTC m=+1381.884635770"
Mar 07 21:40:58.991001 master-0 kubenswrapper[16352]: I0307 21:40:58.990946 16352 scope.go:117] "RemoveContainer" containerID="235e1853e357a4236ad9c348a09263f93caab1a6f301f445d9ec7cdd6ed4d86d"
Mar 07 21:40:59.003765 master-0 kubenswrapper[16352]: I0307 21:40:59.003060 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-pt84w"]
Mar 07 21:40:59.016164 master-0 kubenswrapper[16352]: I0307 21:40:59.016105 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-58dc6c9559-pt84w"]
Mar 07 21:40:59.208240 master-0 kubenswrapper[16352]: I0307 21:40:59.208165 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="167747ff-9ac4-42be-9894-549a51404415" path="/var/lib/kubelet/pods/167747ff-9ac4-42be-9894-549a51404415/volumes"
Mar 07 21:40:59.209485 master-0 kubenswrapper[16352]: I0307 21:40:59.209443 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6" path="/var/lib/kubelet/pods/5464b2e2-8bd4-43e0-815a-9a1eeeb2b9a6/volumes"
Mar 07 21:40:59.783423 master-0 kubenswrapper[16352]: I0307 21:40:59.783308 16352 generic.go:334] "Generic (PLEG): container finished" podID="842cfecc-7700-4a61-a685-c211db763dcb" containerID="a6e2a25b80eca6e549dfa1628abd2c09396150183fe670303541bdccbe0f9072" exitCode=0
Mar 07 21:40:59.784425 master-0 kubenswrapper[16352]: I0307 21:40:59.783433 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3631-account-create-update-8m8jf" event={"ID":"842cfecc-7700-4a61-a685-c211db763dcb","Type":"ContainerDied","Data":"a6e2a25b80eca6e549dfa1628abd2c09396150183fe670303541bdccbe0f9072"}
Mar 07 21:40:59.787999 master-0 kubenswrapper[16352]: I0307 21:40:59.787937 16352 generic.go:334] "Generic (PLEG): container finished" podID="16841f85-d2fc-46a7-8c09-c957cb2cfb4f" containerID="d05854d3fa0625cdd2ed64a244c92ef91acfa81fc8d83d589e8979a8f3bbb33b" exitCode=0
Mar 07 21:40:59.788278 master-0 kubenswrapper[16352]: I0307 21:40:59.788241 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-shqgh" event={"ID":"16841f85-d2fc-46a7-8c09-c957cb2cfb4f","Type":"ContainerDied","Data":"d05854d3fa0625cdd2ed64a244c92ef91acfa81fc8d83d589e8979a8f3bbb33b"}
Mar 07 21:41:00.543060 master-0 kubenswrapper[16352]: I0307 21:41:00.542635 16352 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/placement-db-create-shqgh"
Mar 07 21:41:00.575257 master-0 kubenswrapper[16352]: I0307 21:41:00.573939 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-operator-scripts\") pod \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") "
Mar 07 21:41:00.575257 master-0 kubenswrapper[16352]: I0307 21:41:00.574285 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b6l9l\" (UniqueName: \"kubernetes.io/projected/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-kube-api-access-b6l9l\") pod \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\" (UID: \"16841f85-d2fc-46a7-8c09-c957cb2cfb4f\") "
Mar 07 21:41:00.575592 master-0 kubenswrapper[16352]: I0307 21:41:00.575325 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "16841f85-d2fc-46a7-8c09-c957cb2cfb4f" (UID: "16841f85-d2fc-46a7-8c09-c957cb2cfb4f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:00.619279 master-0 kubenswrapper[16352]: I0307 21:41:00.619187 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-kube-api-access-b6l9l" (OuterVolumeSpecName: "kube-api-access-b6l9l") pod "16841f85-d2fc-46a7-8c09-c957cb2cfb4f" (UID: "16841f85-d2fc-46a7-8c09-c957cb2cfb4f"). InnerVolumeSpecName "kube-api-access-b6l9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:00.697319 master-0 kubenswrapper[16352]: I0307 21:41:00.697232 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:00.697319 master-0 kubenswrapper[16352]: I0307 21:41:00.697296 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b6l9l\" (UniqueName: \"kubernetes.io/projected/16841f85-d2fc-46a7-8c09-c957cb2cfb4f-kube-api-access-b6l9l\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:00.762860 master-0 kubenswrapper[16352]: I0307 21:41:00.762756 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0"
Mar 07 21:41:00.863674 master-0 kubenswrapper[16352]: I0307 21:41:00.863641 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-shqgh"
Mar 07 21:41:00.870892 master-0 kubenswrapper[16352]: I0307 21:41:00.870799 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-shqgh" event={"ID":"16841f85-d2fc-46a7-8c09-c957cb2cfb4f","Type":"ContainerDied","Data":"5d5b26c93936947f60e91151617eb59d18a11b0b21c77d50933a822d729d673a"}
Mar 07 21:41:00.870987 master-0 kubenswrapper[16352]: I0307 21:41:00.870915 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5d5b26c93936947f60e91151617eb59d18a11b0b21c77d50933a822d729d673a"
Mar 07 21:41:00.917339 master-0 kubenswrapper[16352]: I0307 21:41:00.917273 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8jd9b"
Mar 07 21:41:00.922588 master-0 kubenswrapper[16352]: I0307 21:41:00.922534 16352 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/keystone-c490-account-create-update-rc6gq"
Mar 07 21:41:00.931803 master-0 kubenswrapper[16352]: I0307 21:41:00.931758 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8tkd"
Mar 07 21:41:00.949866 master-0 kubenswrapper[16352]: I0307 21:41:00.947215 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-326e-account-create-update-g8dbq"
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008228 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7b1e7528-a696-43e8-b487-d981fb460467-operator-scripts\") pod \"7b1e7528-a696-43e8-b487-d981fb460467\" (UID: \"7b1e7528-a696-43e8-b487-d981fb460467\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008413 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5585c07c-5a03-46df-a14b-0cd2550e7ca1-operator-scripts\") pod \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008474 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732d7db0-01b4-4187-92d9-01e6a04f91f8-operator-scripts\") pod \"732d7db0-01b4-4187-92d9-01e6a04f91f8\" (UID: \"732d7db0-01b4-4187-92d9-01e6a04f91f8\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008502 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdhmw\" (UniqueName: \"kubernetes.io/projected/5585c07c-5a03-46df-a14b-0cd2550e7ca1-kube-api-access-rdhmw\") pod \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\" (UID: \"5585c07c-5a03-46df-a14b-0cd2550e7ca1\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008557 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-operator-scripts\") pod \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008584 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2cqk7\" (UniqueName: \"kubernetes.io/projected/732d7db0-01b4-4187-92d9-01e6a04f91f8-kube-api-access-2cqk7\") pod \"732d7db0-01b4-4187-92d9-01e6a04f91f8\" (UID: \"732d7db0-01b4-4187-92d9-01e6a04f91f8\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008669 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7px9h\" (UniqueName: \"kubernetes.io/projected/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-kube-api-access-7px9h\") pod \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\" (UID: \"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.008792 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5wcvd\" (UniqueName: \"kubernetes.io/projected/7b1e7528-a696-43e8-b487-d981fb460467-kube-api-access-5wcvd\") pod \"7b1e7528-a696-43e8-b487-d981fb460467\" (UID: \"7b1e7528-a696-43e8-b487-d981fb460467\") "
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.009286 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/732d7db0-01b4-4187-92d9-01e6a04f91f8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "732d7db0-01b4-4187-92d9-01e6a04f91f8" (UID: "732d7db0-01b4-4187-92d9-01e6a04f91f8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.009304 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "dc6fea59-4e62-414e-a8c0-d6c6a60fb72c" (UID: "dc6fea59-4e62-414e-a8c0-d6c6a60fb72c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:01.009758 master-0 kubenswrapper[16352]: I0307 21:41:01.009401 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/732d7db0-01b4-4187-92d9-01e6a04f91f8-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.010412 master-0 kubenswrapper[16352]: I0307 21:41:01.009872 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b1e7528-a696-43e8-b487-d981fb460467-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7b1e7528-a696-43e8-b487-d981fb460467" (UID: "7b1e7528-a696-43e8-b487-d981fb460467"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:01.014076 master-0 kubenswrapper[16352]: I0307 21:41:01.014021 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5585c07c-5a03-46df-a14b-0cd2550e7ca1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5585c07c-5a03-46df-a14b-0cd2550e7ca1" (UID: "5585c07c-5a03-46df-a14b-0cd2550e7ca1"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:01.019316 master-0 kubenswrapper[16352]: I0307 21:41:01.019247 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b1e7528-a696-43e8-b487-d981fb460467-kube-api-access-5wcvd" (OuterVolumeSpecName: "kube-api-access-5wcvd") pod "7b1e7528-a696-43e8-b487-d981fb460467" (UID: "7b1e7528-a696-43e8-b487-d981fb460467"). InnerVolumeSpecName "kube-api-access-5wcvd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:01.023025 master-0 kubenswrapper[16352]: I0307 21:41:01.022958 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/732d7db0-01b4-4187-92d9-01e6a04f91f8-kube-api-access-2cqk7" (OuterVolumeSpecName: "kube-api-access-2cqk7") pod "732d7db0-01b4-4187-92d9-01e6a04f91f8" (UID: "732d7db0-01b4-4187-92d9-01e6a04f91f8"). InnerVolumeSpecName "kube-api-access-2cqk7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:01.023962 master-0 kubenswrapper[16352]: I0307 21:41:01.023913 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-kube-api-access-7px9h" (OuterVolumeSpecName: "kube-api-access-7px9h") pod "dc6fea59-4e62-414e-a8c0-d6c6a60fb72c" (UID: "dc6fea59-4e62-414e-a8c0-d6c6a60fb72c"). InnerVolumeSpecName "kube-api-access-7px9h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:01.037084 master-0 kubenswrapper[16352]: I0307 21:41:01.037015 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5585c07c-5a03-46df-a14b-0cd2550e7ca1-kube-api-access-rdhmw" (OuterVolumeSpecName: "kube-api-access-rdhmw") pod "5585c07c-5a03-46df-a14b-0cd2550e7ca1" (UID: "5585c07c-5a03-46df-a14b-0cd2550e7ca1"). InnerVolumeSpecName "kube-api-access-rdhmw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:01.112315 master-0 kubenswrapper[16352]: I0307 21:41:01.112020 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5585c07c-5a03-46df-a14b-0cd2550e7ca1-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.112315 master-0 kubenswrapper[16352]: I0307 21:41:01.112317 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rdhmw\" (UniqueName: \"kubernetes.io/projected/5585c07c-5a03-46df-a14b-0cd2550e7ca1-kube-api-access-rdhmw\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.112315 master-0 kubenswrapper[16352]: I0307 21:41:01.112334 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.112315 master-0 kubenswrapper[16352]: I0307 21:41:01.112344 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2cqk7\" (UniqueName: \"kubernetes.io/projected/732d7db0-01b4-4187-92d9-01e6a04f91f8-kube-api-access-2cqk7\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.112315 master-0 kubenswrapper[16352]: I0307 21:41:01.112358 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7px9h\" (UniqueName: \"kubernetes.io/projected/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c-kube-api-access-7px9h\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.112796 master-0 kubenswrapper[16352]: I0307 21:41:01.112370 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5wcvd\" (UniqueName: \"kubernetes.io/projected/7b1e7528-a696-43e8-b487-d981fb460467-kube-api-access-5wcvd\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.112796 master-0 kubenswrapper[16352]: I0307 21:41:01.112381 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\"
(UniqueName: \"kubernetes.io/configmap/7b1e7528-a696-43e8-b487-d981fb460467-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.397488 master-0 kubenswrapper[16352]: I0307 21:41:01.397427 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-3631-account-create-update-8m8jf"
Mar 07 21:41:01.524024 master-0 kubenswrapper[16352]: I0307 21:41:01.523931 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842cfecc-7700-4a61-a685-c211db763dcb-operator-scripts\") pod \"842cfecc-7700-4a61-a685-c211db763dcb\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") "
Mar 07 21:41:01.524305 master-0 kubenswrapper[16352]: I0307 21:41:01.524193 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfnts\" (UniqueName: \"kubernetes.io/projected/842cfecc-7700-4a61-a685-c211db763dcb-kube-api-access-qfnts\") pod \"842cfecc-7700-4a61-a685-c211db763dcb\" (UID: \"842cfecc-7700-4a61-a685-c211db763dcb\") "
Mar 07 21:41:01.525987 master-0 kubenswrapper[16352]: I0307 21:41:01.525929 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/842cfecc-7700-4a61-a685-c211db763dcb-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "842cfecc-7700-4a61-a685-c211db763dcb" (UID: "842cfecc-7700-4a61-a685-c211db763dcb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:01.528845 master-0 kubenswrapper[16352]: I0307 21:41:01.528787 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/842cfecc-7700-4a61-a685-c211db763dcb-kube-api-access-qfnts" (OuterVolumeSpecName: "kube-api-access-qfnts") pod "842cfecc-7700-4a61-a685-c211db763dcb" (UID: "842cfecc-7700-4a61-a685-c211db763dcb"). InnerVolumeSpecName "kube-api-access-qfnts". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:01.630251 master-0 kubenswrapper[16352]: I0307 21:41:01.630177 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/842cfecc-7700-4a61-a685-c211db763dcb-operator-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.630251 master-0 kubenswrapper[16352]: I0307 21:41:01.630241 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfnts\" (UniqueName: \"kubernetes.io/projected/842cfecc-7700-4a61-a685-c211db763dcb-kube-api-access-qfnts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:01.877926 master-0 kubenswrapper[16352]: I0307 21:41:01.877851 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-c490-account-create-update-rc6gq"
Mar 07 21:41:01.879065 master-0 kubenswrapper[16352]: I0307 21:41:01.879012 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-c490-account-create-update-rc6gq" event={"ID":"7b1e7528-a696-43e8-b487-d981fb460467","Type":"ContainerDied","Data":"350aef14ef463f18b5d495616832ae0176314ab9386071aca57c3b39bd748a01"}
Mar 07 21:41:01.879065 master-0 kubenswrapper[16352]: I0307 21:41:01.879059 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="350aef14ef463f18b5d495616832ae0176314ab9386071aca57c3b39bd748a01"
Mar 07 21:41:01.881575 master-0 kubenswrapper[16352]: I0307 21:41:01.881501 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-q8tkd" event={"ID":"732d7db0-01b4-4187-92d9-01e6a04f91f8","Type":"ContainerDied","Data":"a0073b97ffad389b0581863ce12e3a49cf544d7fc2be66230ca596c99368f692"}
Mar 07 21:41:01.881575 master-0 kubenswrapper[16352]: I0307 21:41:01.881557 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0073b97ffad389b0581863ce12e3a49cf544d7fc2be66230ca596c99368f692"
Mar 07 21:41:01.881745 master-0 kubenswrapper[16352]: I0307 21:41:01.881634 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-q8tkd"
Mar 07 21:41:01.886381 master-0 kubenswrapper[16352]: I0307 21:41:01.886310 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-8jd9b"
Mar 07 21:41:01.886508 master-0 kubenswrapper[16352]: I0307 21:41:01.886439 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-8jd9b" event={"ID":"5585c07c-5a03-46df-a14b-0cd2550e7ca1","Type":"ContainerDied","Data":"d7e0d620714211e3b1ccdb93fea78d7f078e653786a1dda900a18e779558e3cc"}
Mar 07 21:41:01.886565 master-0 kubenswrapper[16352]: I0307 21:41:01.886511 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d7e0d620714211e3b1ccdb93fea78d7f078e653786a1dda900a18e779558e3cc"
Mar 07 21:41:01.888990 master-0 kubenswrapper[16352]: I0307 21:41:01.888930 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-326e-account-create-update-g8dbq" event={"ID":"dc6fea59-4e62-414e-a8c0-d6c6a60fb72c","Type":"ContainerDied","Data":"2a5766a923eff36357e615ca4b5984fdf3cc1734be73fa7ff0efe3c29e7cfee6"}
Mar 07 21:41:01.888990 master-0 kubenswrapper[16352]: I0307 21:41:01.888964 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2a5766a923eff36357e615ca4b5984fdf3cc1734be73fa7ff0efe3c29e7cfee6"
Mar 07 21:41:01.889105 master-0 kubenswrapper[16352]: I0307 21:41:01.889034 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-326e-account-create-update-g8dbq"
Mar 07 21:41:01.892276 master-0 kubenswrapper[16352]: I0307 21:41:01.892233 16352 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openstack/glance-3631-account-create-update-8m8jf"
Mar 07 21:41:01.892832 master-0 kubenswrapper[16352]: I0307 21:41:01.892211 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-3631-account-create-update-8m8jf" event={"ID":"842cfecc-7700-4a61-a685-c211db763dcb","Type":"ContainerDied","Data":"ba4c73beb7ef08d0b9908a7edfbd650647bcd25a8c5a32c4c1c09b0df9ed7233"}
Mar 07 21:41:01.892832 master-0 kubenswrapper[16352]: I0307 21:41:01.892389 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba4c73beb7ef08d0b9908a7edfbd650647bcd25a8c5a32c4c1c09b0df9ed7233"
Mar 07 21:41:02.219509 master-0 kubenswrapper[16352]: I0307 21:41:02.219345 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-m995r"]
Mar 07 21:41:02.220086 master-0 kubenswrapper[16352]: E0307 21:41:02.220049 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5585c07c-5a03-46df-a14b-0cd2550e7ca1" containerName="mariadb-database-create"
Mar 07 21:41:02.220086 master-0 kubenswrapper[16352]: I0307 21:41:02.220072 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="5585c07c-5a03-46df-a14b-0cd2550e7ca1" containerName="mariadb-database-create"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: E0307 21:41:02.220107 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167747ff-9ac4-42be-9894-549a51404415" containerName="dnsmasq-dns"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: I0307 21:41:02.220115 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="167747ff-9ac4-42be-9894-549a51404415" containerName="dnsmasq-dns"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: E0307 21:41:02.220138 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16841f85-d2fc-46a7-8c09-c957cb2cfb4f" containerName="mariadb-database-create"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: I0307 21:41:02.220149 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="16841f85-d2fc-46a7-8c09-c957cb2cfb4f" containerName="mariadb-database-create"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: E0307 21:41:02.220160 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b1e7528-a696-43e8-b487-d981fb460467" containerName="mariadb-account-create-update"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: I0307 21:41:02.220169 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b1e7528-a696-43e8-b487-d981fb460467" containerName="mariadb-account-create-update"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: E0307 21:41:02.220178 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dc6fea59-4e62-414e-a8c0-d6c6a60fb72c" containerName="mariadb-account-create-update"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: I0307 21:41:02.220186 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc6fea59-4e62-414e-a8c0-d6c6a60fb72c" containerName="mariadb-account-create-update"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: E0307 21:41:02.220202 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="167747ff-9ac4-42be-9894-549a51404415" containerName="init"
Mar 07 21:41:02.220208 master-0 kubenswrapper[16352]: I0307 21:41:02.220209 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="167747ff-9ac4-42be-9894-549a51404415" containerName="init"
Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: E0307 21:41:02.220229 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="842cfecc-7700-4a61-a685-c211db763dcb" containerName="mariadb-account-create-update"
Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220238 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="842cfecc-7700-4a61-a685-c211db763dcb" containerName="mariadb-account-create-update"
Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: E0307 21:41:02.220263 16352 cpu_manager.go:410]
"RemoveStaleState: removing container" podUID="732d7db0-01b4-4187-92d9-01e6a04f91f8" containerName="mariadb-database-create" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220269 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="732d7db0-01b4-4187-92d9-01e6a04f91f8" containerName="mariadb-database-create" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220500 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="5585c07c-5a03-46df-a14b-0cd2550e7ca1" containerName="mariadb-database-create" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220530 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="dc6fea59-4e62-414e-a8c0-d6c6a60fb72c" containerName="mariadb-account-create-update" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220539 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="842cfecc-7700-4a61-a685-c211db763dcb" containerName="mariadb-account-create-update" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220550 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="732d7db0-01b4-4187-92d9-01e6a04f91f8" containerName="mariadb-database-create" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220567 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="167747ff-9ac4-42be-9894-549a51404415" containerName="dnsmasq-dns" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220583 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="16841f85-d2fc-46a7-8c09-c957cb2cfb4f" containerName="mariadb-database-create" Mar 07 21:41:02.220618 master-0 kubenswrapper[16352]: I0307 21:41:02.220598 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b1e7528-a696-43e8-b487-d981fb460467" containerName="mariadb-account-create-update" Mar 07 21:41:02.221482 master-0 kubenswrapper[16352]: I0307 21:41:02.221454 
16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m995r" Mar 07 21:41:02.223814 master-0 kubenswrapper[16352]: I0307 21:41:02.223740 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 07 21:41:02.235116 master-0 kubenswrapper[16352]: I0307 21:41:02.235044 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m995r"] Mar 07 21:41:02.351726 master-0 kubenswrapper[16352]: I0307 21:41:02.351621 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gn8nv\" (UniqueName: \"kubernetes.io/projected/2ae64be7-df95-407d-bcd8-732b98e9df90-kube-api-access-gn8nv\") pod \"root-account-create-update-m995r\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " pod="openstack/root-account-create-update-m995r" Mar 07 21:41:02.352089 master-0 kubenswrapper[16352]: I0307 21:41:02.351753 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae64be7-df95-407d-bcd8-732b98e9df90-operator-scripts\") pod \"root-account-create-update-m995r\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " pod="openstack/root-account-create-update-m995r" Mar 07 21:41:02.459544 master-0 kubenswrapper[16352]: I0307 21:41:02.459421 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gn8nv\" (UniqueName: \"kubernetes.io/projected/2ae64be7-df95-407d-bcd8-732b98e9df90-kube-api-access-gn8nv\") pod \"root-account-create-update-m995r\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " pod="openstack/root-account-create-update-m995r" Mar 07 21:41:02.459929 master-0 kubenswrapper[16352]: I0307 21:41:02.459578 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2ae64be7-df95-407d-bcd8-732b98e9df90-operator-scripts\") pod \"root-account-create-update-m995r\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " pod="openstack/root-account-create-update-m995r" Mar 07 21:41:02.460520 master-0 kubenswrapper[16352]: I0307 21:41:02.460460 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae64be7-df95-407d-bcd8-732b98e9df90-operator-scripts\") pod \"root-account-create-update-m995r\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " pod="openstack/root-account-create-update-m995r" Mar 07 21:41:02.491464 master-0 kubenswrapper[16352]: I0307 21:41:02.491289 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gn8nv\" (UniqueName: \"kubernetes.io/projected/2ae64be7-df95-407d-bcd8-732b98e9df90-kube-api-access-gn8nv\") pod \"root-account-create-update-m995r\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " pod="openstack/root-account-create-update-m995r" Mar 07 21:41:02.551783 master-0 kubenswrapper[16352]: I0307 21:41:02.551632 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m995r" Mar 07 21:41:03.063048 master-0 kubenswrapper[16352]: W0307 21:41:03.062977 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ae64be7_df95_407d_bcd8_732b98e9df90.slice/crio-1b094ab305a80829982becf2bf538f675a286e9137bca06101d30827e3c51bf0 WatchSource:0}: Error finding container 1b094ab305a80829982becf2bf538f675a286e9137bca06101d30827e3c51bf0: Status 404 returned error can't find the container with id 1b094ab305a80829982becf2bf538f675a286e9137bca06101d30827e3c51bf0 Mar 07 21:41:03.073997 master-0 kubenswrapper[16352]: I0307 21:41:03.073912 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-m995r"] Mar 07 21:41:03.958118 master-0 kubenswrapper[16352]: I0307 21:41:03.958056 16352 generic.go:334] "Generic (PLEG): container finished" podID="2ae64be7-df95-407d-bcd8-732b98e9df90" containerID="e56be3b2b266c11a944e77332dce1c1cf807a1b6fe3a182e779e02a1245f5cd2" exitCode=0 Mar 07 21:41:03.958364 master-0 kubenswrapper[16352]: I0307 21:41:03.958127 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m995r" event={"ID":"2ae64be7-df95-407d-bcd8-732b98e9df90","Type":"ContainerDied","Data":"e56be3b2b266c11a944e77332dce1c1cf807a1b6fe3a182e779e02a1245f5cd2"} Mar 07 21:41:03.958364 master-0 kubenswrapper[16352]: I0307 21:41:03.958157 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m995r" event={"ID":"2ae64be7-df95-407d-bcd8-732b98e9df90","Type":"ContainerStarted","Data":"1b094ab305a80829982becf2bf538f675a286e9137bca06101d30827e3c51bf0"} Mar 07 21:41:04.302930 master-0 kubenswrapper[16352]: I0307 21:41:04.302741 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0" Mar 07 21:41:04.312084 master-0 kubenswrapper[16352]: I0307 21:41:04.312008 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62-etc-swift\") pod \"swift-storage-0\" (UID: \"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62\") " pod="openstack/swift-storage-0" Mar 07 21:41:04.582994 master-0 kubenswrapper[16352]: I0307 21:41:04.582668 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 07 21:41:04.744822 master-0 kubenswrapper[16352]: I0307 21:41:04.741218 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-s9668"] Mar 07 21:41:04.744822 master-0 kubenswrapper[16352]: I0307 21:41:04.743759 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.748178 master-0 kubenswrapper[16352]: I0307 21:41:04.747152 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-213eb-config-data" Mar 07 21:41:04.758819 master-0 kubenswrapper[16352]: I0307 21:41:04.758753 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s9668"] Mar 07 21:41:04.820803 master-0 kubenswrapper[16352]: I0307 21:41:04.817929 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcq7j\" (UniqueName: \"kubernetes.io/projected/c53b3432-7649-440a-b109-bb48be9f10c7-kube-api-access-rcq7j\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.820803 master-0 kubenswrapper[16352]: I0307 21:41:04.818561 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-combined-ca-bundle\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.820803 master-0 kubenswrapper[16352]: I0307 21:41:04.818774 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-config-data\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.820803 master-0 kubenswrapper[16352]: I0307 21:41:04.818826 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-db-sync-config-data\") pod \"glance-db-sync-s9668\" (UID: 
\"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.920769 master-0 kubenswrapper[16352]: I0307 21:41:04.920663 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-combined-ca-bundle\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.921073 master-0 kubenswrapper[16352]: I0307 21:41:04.920820 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-config-data\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.921248 master-0 kubenswrapper[16352]: I0307 21:41:04.921132 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-db-sync-config-data\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.921405 master-0 kubenswrapper[16352]: I0307 21:41:04.921367 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcq7j\" (UniqueName: \"kubernetes.io/projected/c53b3432-7649-440a-b109-bb48be9f10c7-kube-api-access-rcq7j\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.927957 master-0 kubenswrapper[16352]: I0307 21:41:04.925218 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-combined-ca-bundle\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " 
pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.927957 master-0 kubenswrapper[16352]: I0307 21:41:04.926729 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-config-data\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.938867 master-0 kubenswrapper[16352]: I0307 21:41:04.938594 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-db-sync-config-data\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.943698 master-0 kubenswrapper[16352]: I0307 21:41:04.943609 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcq7j\" (UniqueName: \"kubernetes.io/projected/c53b3432-7649-440a-b109-bb48be9f10c7-kube-api-access-rcq7j\") pod \"glance-db-sync-s9668\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " pod="openstack/glance-db-sync-s9668" Mar 07 21:41:04.990857 master-0 kubenswrapper[16352]: I0307 21:41:04.990111 16352 generic.go:334] "Generic (PLEG): container finished" podID="e74c1a05-5f07-4375-99f3-8f0db281543b" containerID="51b420456f8034f930715129b10530859cb6a93859a69b9be2aacbae1a0a2dc7" exitCode=0 Mar 07 21:41:04.990857 master-0 kubenswrapper[16352]: I0307 21:41:04.990200 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-796n4" event={"ID":"e74c1a05-5f07-4375-99f3-8f0db281543b","Type":"ContainerDied","Data":"51b420456f8034f930715129b10530859cb6a93859a69b9be2aacbae1a0a2dc7"} Mar 07 21:41:05.102453 master-0 kubenswrapper[16352]: I0307 21:41:05.102372 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s9668" Mar 07 21:41:05.185069 master-0 kubenswrapper[16352]: W0307 21:41:05.184995 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc5e9bdc6_41f4_4211_8bb2_e6eb1193bf62.slice/crio-aaabaf5b11dd78ac794fe0db313cbfee4271e7f88f21a0cdde0d6312f8c9b72a WatchSource:0}: Error finding container aaabaf5b11dd78ac794fe0db313cbfee4271e7f88f21a0cdde0d6312f8c9b72a: Status 404 returned error can't find the container with id aaabaf5b11dd78ac794fe0db313cbfee4271e7f88f21a0cdde0d6312f8c9b72a Mar 07 21:41:05.207325 master-0 kubenswrapper[16352]: I0307 21:41:05.207248 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 07 21:41:05.557889 master-0 kubenswrapper[16352]: I0307 21:41:05.557836 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-m995r" Mar 07 21:41:05.648741 master-0 kubenswrapper[16352]: I0307 21:41:05.648640 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae64be7-df95-407d-bcd8-732b98e9df90-operator-scripts\") pod \"2ae64be7-df95-407d-bcd8-732b98e9df90\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " Mar 07 21:41:05.649011 master-0 kubenswrapper[16352]: I0307 21:41:05.648760 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gn8nv\" (UniqueName: \"kubernetes.io/projected/2ae64be7-df95-407d-bcd8-732b98e9df90-kube-api-access-gn8nv\") pod \"2ae64be7-df95-407d-bcd8-732b98e9df90\" (UID: \"2ae64be7-df95-407d-bcd8-732b98e9df90\") " Mar 07 21:41:05.650558 master-0 kubenswrapper[16352]: I0307 21:41:05.650481 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2ae64be7-df95-407d-bcd8-732b98e9df90-operator-scripts" (OuterVolumeSpecName: 
"operator-scripts") pod "2ae64be7-df95-407d-bcd8-732b98e9df90" (UID: "2ae64be7-df95-407d-bcd8-732b98e9df90"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:05.654906 master-0 kubenswrapper[16352]: I0307 21:41:05.654786 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2ae64be7-df95-407d-bcd8-732b98e9df90-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:05.659092 master-0 kubenswrapper[16352]: I0307 21:41:05.658660 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ae64be7-df95-407d-bcd8-732b98e9df90-kube-api-access-gn8nv" (OuterVolumeSpecName: "kube-api-access-gn8nv") pod "2ae64be7-df95-407d-bcd8-732b98e9df90" (UID: "2ae64be7-df95-407d-bcd8-732b98e9df90"). InnerVolumeSpecName "kube-api-access-gn8nv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:05.765093 master-0 kubenswrapper[16352]: I0307 21:41:05.764837 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gn8nv\" (UniqueName: \"kubernetes.io/projected/2ae64be7-df95-407d-bcd8-732b98e9df90-kube-api-access-gn8nv\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:05.774836 master-0 kubenswrapper[16352]: I0307 21:41:05.774668 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-s9668"] Mar 07 21:41:06.034670 master-0 kubenswrapper[16352]: I0307 21:41:06.034574 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-m995r" event={"ID":"2ae64be7-df95-407d-bcd8-732b98e9df90","Type":"ContainerDied","Data":"1b094ab305a80829982becf2bf538f675a286e9137bca06101d30827e3c51bf0"} Mar 07 21:41:06.034956 master-0 kubenswrapper[16352]: I0307 21:41:06.034591 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-m995r" Mar 07 21:41:06.035080 master-0 kubenswrapper[16352]: I0307 21:41:06.034932 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b094ab305a80829982becf2bf538f675a286e9137bca06101d30827e3c51bf0" Mar 07 21:41:06.038614 master-0 kubenswrapper[16352]: I0307 21:41:06.038577 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"aaabaf5b11dd78ac794fe0db313cbfee4271e7f88f21a0cdde0d6312f8c9b72a"} Mar 07 21:41:06.041828 master-0 kubenswrapper[16352]: I0307 21:41:06.041743 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s9668" event={"ID":"c53b3432-7649-440a-b109-bb48be9f10c7","Type":"ContainerStarted","Data":"722c174453d42d2d76049c92375dd65316a0792ae745900c1ff848f26156bc0b"} Mar 07 21:41:06.431392 master-0 kubenswrapper[16352]: I0307 21:41:06.431324 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:41:06.482933 master-0 kubenswrapper[16352]: I0307 21:41:06.482455 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-scripts\") pod \"e74c1a05-5f07-4375-99f3-8f0db281543b\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " Mar 07 21:41:06.482933 master-0 kubenswrapper[16352]: I0307 21:41:06.482644 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9r2vt\" (UniqueName: \"kubernetes.io/projected/e74c1a05-5f07-4375-99f3-8f0db281543b-kube-api-access-9r2vt\") pod \"e74c1a05-5f07-4375-99f3-8f0db281543b\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " Mar 07 21:41:06.482933 master-0 kubenswrapper[16352]: I0307 21:41:06.482748 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-dispersionconf\") pod \"e74c1a05-5f07-4375-99f3-8f0db281543b\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " Mar 07 21:41:06.484885 master-0 kubenswrapper[16352]: I0307 21:41:06.483414 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-ring-data-devices\") pod \"e74c1a05-5f07-4375-99f3-8f0db281543b\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " Mar 07 21:41:06.484885 master-0 kubenswrapper[16352]: I0307 21:41:06.483483 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e74c1a05-5f07-4375-99f3-8f0db281543b-etc-swift\") pod \"e74c1a05-5f07-4375-99f3-8f0db281543b\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " Mar 07 21:41:06.484885 master-0 kubenswrapper[16352]: I0307 21:41:06.483511 16352 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-combined-ca-bundle\") pod \"e74c1a05-5f07-4375-99f3-8f0db281543b\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " Mar 07 21:41:06.484885 master-0 kubenswrapper[16352]: I0307 21:41:06.483602 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-swiftconf\") pod \"e74c1a05-5f07-4375-99f3-8f0db281543b\" (UID: \"e74c1a05-5f07-4375-99f3-8f0db281543b\") " Mar 07 21:41:06.487162 master-0 kubenswrapper[16352]: I0307 21:41:06.487122 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "e74c1a05-5f07-4375-99f3-8f0db281543b" (UID: "e74c1a05-5f07-4375-99f3-8f0db281543b"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:06.487251 master-0 kubenswrapper[16352]: I0307 21:41:06.487145 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e74c1a05-5f07-4375-99f3-8f0db281543b-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "e74c1a05-5f07-4375-99f3-8f0db281543b" (UID: "e74c1a05-5f07-4375-99f3-8f0db281543b"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:41:06.497088 master-0 kubenswrapper[16352]: I0307 21:41:06.496957 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e74c1a05-5f07-4375-99f3-8f0db281543b-kube-api-access-9r2vt" (OuterVolumeSpecName: "kube-api-access-9r2vt") pod "e74c1a05-5f07-4375-99f3-8f0db281543b" (UID: "e74c1a05-5f07-4375-99f3-8f0db281543b"). InnerVolumeSpecName "kube-api-access-9r2vt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:06.516019 master-0 kubenswrapper[16352]: I0307 21:41:06.515891 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "e74c1a05-5f07-4375-99f3-8f0db281543b" (UID: "e74c1a05-5f07-4375-99f3-8f0db281543b"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:06.545371 master-0 kubenswrapper[16352]: I0307 21:41:06.545253 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "e74c1a05-5f07-4375-99f3-8f0db281543b" (UID: "e74c1a05-5f07-4375-99f3-8f0db281543b"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:06.552009 master-0 kubenswrapper[16352]: I0307 21:41:06.551820 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e74c1a05-5f07-4375-99f3-8f0db281543b" (UID: "e74c1a05-5f07-4375-99f3-8f0db281543b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:06.553070 master-0 kubenswrapper[16352]: I0307 21:41:06.552857 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-scripts" (OuterVolumeSpecName: "scripts") pod "e74c1a05-5f07-4375-99f3-8f0db281543b" (UID: "e74c1a05-5f07-4375-99f3-8f0db281543b"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:06.590715 master-0 kubenswrapper[16352]: I0307 21:41:06.590445 16352 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-dispersionconf\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:06.590715 master-0 kubenswrapper[16352]: I0307 21:41:06.590515 16352 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-ring-data-devices\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:06.590715 master-0 kubenswrapper[16352]: I0307 21:41:06.590530 16352 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/e74c1a05-5f07-4375-99f3-8f0db281543b-etc-swift\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:06.590715 master-0 kubenswrapper[16352]: I0307 21:41:06.590543 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:06.590715 master-0 kubenswrapper[16352]: I0307 21:41:06.590556 16352 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/e74c1a05-5f07-4375-99f3-8f0db281543b-swiftconf\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:06.590715 master-0 kubenswrapper[16352]: I0307 21:41:06.590568 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e74c1a05-5f07-4375-99f3-8f0db281543b-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:06.590715 master-0 kubenswrapper[16352]: I0307 21:41:06.590582 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9r2vt\" (UniqueName: 
\"kubernetes.io/projected/e74c1a05-5f07-4375-99f3-8f0db281543b-kube-api-access-9r2vt\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:06.868570 master-0 kubenswrapper[16352]: I0307 21:41:06.868445 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wptpb" podUID="59890fa2-937c-4fc1-9f3d-6c2297a9d46b" containerName="ovn-controller" probeResult="failure" output=< Mar 07 21:41:06.868570 master-0 kubenswrapper[16352]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 07 21:41:06.868570 master-0 kubenswrapper[16352]: > Mar 07 21:41:06.902923 master-0 kubenswrapper[16352]: I0307 21:41:06.902839 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:41:07.061632 master-0 kubenswrapper[16352]: I0307 21:41:07.061568 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-796n4" event={"ID":"e74c1a05-5f07-4375-99f3-8f0db281543b","Type":"ContainerDied","Data":"1b521828b4f8a411551598d711446fb74f76411b7f8e82d58e1125beecaf61bd"} Mar 07 21:41:07.061632 master-0 kubenswrapper[16352]: I0307 21:41:07.061632 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1b521828b4f8a411551598d711446fb74f76411b7f8e82d58e1125beecaf61bd" Mar 07 21:41:07.061914 master-0 kubenswrapper[16352]: I0307 21:41:07.061633 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-ring-rebalance-796n4" Mar 07 21:41:07.067845 master-0 kubenswrapper[16352]: I0307 21:41:07.067789 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"296b1ad4dbf431ac62813703754a6bb2d772c0b8d439ac44490e5379afd6926a"} Mar 07 21:41:07.067845 master-0 kubenswrapper[16352]: I0307 21:41:07.067846 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"04310b4b859cd972d0d8349ae73e9c0b96582347831a5dee73238b88038d1ebb"} Mar 07 21:41:07.067845 master-0 kubenswrapper[16352]: I0307 21:41:07.067858 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"bb4c1e1e412dbdcc0e61c54575aa2f8d83cd210dc4e3206406b53204d91aaf42"} Mar 07 21:41:08.083405 master-0 kubenswrapper[16352]: I0307 21:41:08.083338 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"2734cc249967cfb9959f0843d66ce0ab3061db23e2453631b8408597df7ed59e"} Mar 07 21:41:09.108477 master-0 kubenswrapper[16352]: I0307 21:41:09.108425 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"682abbcc7cf7f8dd63cd50118d99804b4c6fc0e80f3c4790e0a52e4d43dfe6e9"} Mar 07 21:41:09.108933 master-0 kubenswrapper[16352]: I0307 21:41:09.108488 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"edfd72e15f41885f8a6d37ecdbb7b59296b0d77952eb75ebfaff32c0c3a3b3ad"} Mar 07 21:41:09.112500 master-0 
kubenswrapper[16352]: I0307 21:41:09.112464 16352 generic.go:334] "Generic (PLEG): container finished" podID="4d2620a2-19d9-4543-922c-dc7951734958" containerID="831b8bfe86eef604f21234ea36051d147718c53685172c2c1b6643f3cbe45146" exitCode=0 Mar 07 21:41:09.112619 master-0 kubenswrapper[16352]: I0307 21:41:09.112528 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"4d2620a2-19d9-4543-922c-dc7951734958","Type":"ContainerDied","Data":"831b8bfe86eef604f21234ea36051d147718c53685172c2c1b6643f3cbe45146"} Mar 07 21:41:09.121233 master-0 kubenswrapper[16352]: I0307 21:41:09.121150 16352 generic.go:334] "Generic (PLEG): container finished" podID="6f025883-7fbd-4887-9328-36ba8b9c326b" containerID="430f1bf4eec06fab444c5588005d35eb7be06654e63c8278b5926d996bb92429" exitCode=0 Mar 07 21:41:09.121233 master-0 kubenswrapper[16352]: I0307 21:41:09.121226 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f025883-7fbd-4887-9328-36ba8b9c326b","Type":"ContainerDied","Data":"430f1bf4eec06fab444c5588005d35eb7be06654e63c8278b5926d996bb92429"} Mar 07 21:41:10.143480 master-0 kubenswrapper[16352]: I0307 21:41:10.143359 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"874dc8f50c1dd7be8edcc4da4ca5b37bed69a597750fb1cef5b4db142cd5fb0e"} Mar 07 21:41:10.143480 master-0 kubenswrapper[16352]: I0307 21:41:10.143467 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"67130e81c9a5a078d938a73819e1d824835e96fbb157d3eb7e77da47083ee7df"} Mar 07 21:41:10.146276 master-0 kubenswrapper[16352]: I0307 21:41:10.146235 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" 
event={"ID":"4d2620a2-19d9-4543-922c-dc7951734958","Type":"ContainerStarted","Data":"990d2e3ceda504a3fa016485818e165f4d2150029547487d2659343725af6d33"} Mar 07 21:41:10.147294 master-0 kubenswrapper[16352]: I0307 21:41:10.147186 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 07 21:41:10.151001 master-0 kubenswrapper[16352]: I0307 21:41:10.150956 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"6f025883-7fbd-4887-9328-36ba8b9c326b","Type":"ContainerStarted","Data":"f43680f650962af04ff17f36fcbb4fd11a64f04353637b7f64601b72b0f2a542"} Mar 07 21:41:10.151573 master-0 kubenswrapper[16352]: I0307 21:41:10.151530 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 07 21:41:10.198221 master-0 kubenswrapper[16352]: I0307 21:41:10.198069 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=57.069389813 podStartE2EDuration="1m11.198040235s" podCreationTimestamp="2026-03-07 21:39:59 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.428162877 +0000 UTC m=+1342.498867936" lastFinishedPulling="2026-03-07 21:40:33.556813299 +0000 UTC m=+1356.627518358" observedRunningTime="2026-03-07 21:41:10.184959202 +0000 UTC m=+1393.255664251" watchObservedRunningTime="2026-03-07 21:41:10.198040235 +0000 UTC m=+1393.268745304" Mar 07 21:41:10.218471 master-0 kubenswrapper[16352]: I0307 21:41:10.218271 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=56.637235991 podStartE2EDuration="1m11.218245481s" podCreationTimestamp="2026-03-07 21:39:59 +0000 UTC" firstStartedPulling="2026-03-07 21:40:19.428846053 +0000 UTC m=+1342.499551112" lastFinishedPulling="2026-03-07 21:40:34.009855523 +0000 UTC m=+1357.080560602" observedRunningTime="2026-03-07 21:41:10.212846821 
+0000 UTC m=+1393.283551880" watchObservedRunningTime="2026-03-07 21:41:10.218245481 +0000 UTC m=+1393.288950550" Mar 07 21:41:11.177942 master-0 kubenswrapper[16352]: I0307 21:41:11.177822 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"bf71605bc9ae6f62d69dadf304345c7cc33ececf10136b9fe5a09a33455ec87d"} Mar 07 21:41:11.844210 master-0 kubenswrapper[16352]: I0307 21:41:11.844122 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wptpb" podUID="59890fa2-937c-4fc1-9f3d-6c2297a9d46b" containerName="ovn-controller" probeResult="failure" output=< Mar 07 21:41:11.844210 master-0 kubenswrapper[16352]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 07 21:41:11.844210 master-0 kubenswrapper[16352]: > Mar 07 21:41:11.927075 master-0 kubenswrapper[16352]: I0307 21:41:11.926973 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-csxfx" Mar 07 21:41:12.204826 master-0 kubenswrapper[16352]: I0307 21:41:12.202227 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-wptpb-config-x2v29"] Mar 07 21:41:12.204826 master-0 kubenswrapper[16352]: E0307 21:41:12.203054 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e74c1a05-5f07-4375-99f3-8f0db281543b" containerName="swift-ring-rebalance" Mar 07 21:41:12.204826 master-0 kubenswrapper[16352]: I0307 21:41:12.203074 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e74c1a05-5f07-4375-99f3-8f0db281543b" containerName="swift-ring-rebalance" Mar 07 21:41:12.207727 master-0 kubenswrapper[16352]: E0307 21:41:12.207667 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ae64be7-df95-407d-bcd8-732b98e9df90" containerName="mariadb-account-create-update" Mar 07 21:41:12.207727 master-0 
kubenswrapper[16352]: I0307 21:41:12.207717 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ae64be7-df95-407d-bcd8-732b98e9df90" containerName="mariadb-account-create-update" Mar 07 21:41:12.208236 master-0 kubenswrapper[16352]: I0307 21:41:12.208205 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ae64be7-df95-407d-bcd8-732b98e9df90" containerName="mariadb-account-create-update" Mar 07 21:41:12.208292 master-0 kubenswrapper[16352]: I0307 21:41:12.208252 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="e74c1a05-5f07-4375-99f3-8f0db281543b" containerName="swift-ring-rebalance" Mar 07 21:41:12.209537 master-0 kubenswrapper[16352]: I0307 21:41:12.209502 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.216163 master-0 kubenswrapper[16352]: I0307 21:41:12.216094 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 07 21:41:12.223448 master-0 kubenswrapper[16352]: I0307 21:41:12.223373 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wptpb-config-x2v29"] Mar 07 21:41:12.237396 master-0 kubenswrapper[16352]: I0307 21:41:12.237336 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"ada6b38e4d5220ac54f9bbbaa0596da7c99727559acc314b2e582c0f1cb258b2"} Mar 07 21:41:12.237622 master-0 kubenswrapper[16352]: I0307 21:41:12.237602 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"aecffd23ebc293fa806b933bf28df6b2507647a4c6adc56f360323cdaf01bc98"} Mar 07 21:41:12.237727 master-0 kubenswrapper[16352]: I0307 21:41:12.237710 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"8a0260516ae5a9eacee509b4b6226df1bff9ea2692fbdb77cc97e10cca35ee21"} Mar 07 21:41:12.237807 master-0 kubenswrapper[16352]: I0307 21:41:12.237793 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"22232ec04884a6bcca39556dd10b1915b9abaa04248b620dd90bd3288731c356"} Mar 07 21:41:12.373190 master-0 kubenswrapper[16352]: I0307 21:41:12.373104 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-scripts\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.373602 master-0 kubenswrapper[16352]: I0307 21:41:12.373533 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-additional-scripts\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.373961 master-0 kubenswrapper[16352]: I0307 21:41:12.373908 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.374389 master-0 kubenswrapper[16352]: I0307 21:41:12.374360 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nflwb\" (UniqueName: 
\"kubernetes.io/projected/c3253e69-c364-492d-ba92-7847cd8db923-kube-api-access-nflwb\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.374735 master-0 kubenswrapper[16352]: I0307 21:41:12.374666 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-log-ovn\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.374971 master-0 kubenswrapper[16352]: I0307 21:41:12.374943 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run-ovn\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.478749 master-0 kubenswrapper[16352]: I0307 21:41:12.478559 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-scripts\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479023 master-0 kubenswrapper[16352]: I0307 21:41:12.478842 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-additional-scripts\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479023 master-0 kubenswrapper[16352]: I0307 21:41:12.478896 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479129 master-0 kubenswrapper[16352]: I0307 21:41:12.479095 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nflwb\" (UniqueName: \"kubernetes.io/projected/c3253e69-c364-492d-ba92-7847cd8db923-kube-api-access-nflwb\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479194 master-0 kubenswrapper[16352]: I0307 21:41:12.479175 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-log-ovn\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479307 master-0 kubenswrapper[16352]: I0307 21:41:12.479278 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run-ovn\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479495 master-0 kubenswrapper[16352]: I0307 21:41:12.479451 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run-ovn\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479552 master-0 
kubenswrapper[16352]: I0307 21:41:12.479530 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.479941 master-0 kubenswrapper[16352]: I0307 21:41:12.479905 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-log-ovn\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.480282 master-0 kubenswrapper[16352]: I0307 21:41:12.480233 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-additional-scripts\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.482707 master-0 kubenswrapper[16352]: I0307 21:41:12.482649 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-scripts\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:12.503928 master-0 kubenswrapper[16352]: I0307 21:41:12.503863 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nflwb\" (UniqueName: \"kubernetes.io/projected/c3253e69-c364-492d-ba92-7847cd8db923-kube-api-access-nflwb\") pod \"ovn-controller-wptpb-config-x2v29\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 
21:41:12.543705 master-0 kubenswrapper[16352]: I0307 21:41:12.543620 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:16.867733 master-0 kubenswrapper[16352]: I0307 21:41:16.867612 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-wptpb" podUID="59890fa2-937c-4fc1-9f3d-6c2297a9d46b" containerName="ovn-controller" probeResult="failure" output=< Mar 07 21:41:16.867733 master-0 kubenswrapper[16352]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 07 21:41:16.867733 master-0 kubenswrapper[16352]: > Mar 07 21:41:20.662024 master-0 kubenswrapper[16352]: I0307 21:41:20.661898 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-wptpb-config-x2v29"] Mar 07 21:41:21.444271 master-0 kubenswrapper[16352]: I0307 21:41:21.444111 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s9668" event={"ID":"c53b3432-7649-440a-b109-bb48be9f10c7","Type":"ContainerStarted","Data":"26c2967c8a278140c4cf0ec5f5f1a8d674f8cc49abe5d0529afedb2063bf41c2"} Mar 07 21:41:21.451070 master-0 kubenswrapper[16352]: I0307 21:41:21.450991 16352 generic.go:334] "Generic (PLEG): container finished" podID="c3253e69-c364-492d-ba92-7847cd8db923" containerID="492bceae39af68bf199877a517d49ca13c19ec0d954cb2d9bd2b30513d6aaddf" exitCode=0 Mar 07 21:41:21.451293 master-0 kubenswrapper[16352]: I0307 21:41:21.451053 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wptpb-config-x2v29" event={"ID":"c3253e69-c364-492d-ba92-7847cd8db923","Type":"ContainerDied","Data":"492bceae39af68bf199877a517d49ca13c19ec0d954cb2d9bd2b30513d6aaddf"} Mar 07 21:41:21.451293 master-0 kubenswrapper[16352]: I0307 21:41:21.451145 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wptpb-config-x2v29" 
event={"ID":"c3253e69-c364-492d-ba92-7847cd8db923","Type":"ContainerStarted","Data":"48c53000e15989c224dee288f840e045cad1eb54710987d80a1d958b8c4454ed"} Mar 07 21:41:21.458888 master-0 kubenswrapper[16352]: I0307 21:41:21.458799 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"07e0545eaeadb7c98909796bbfd87e7fd1b01ce3b9a1834567acdaede3236225"} Mar 07 21:41:21.458981 master-0 kubenswrapper[16352]: I0307 21:41:21.458932 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"c5e9bdc6-41f4-4211-8bb2-e6eb1193bf62","Type":"ContainerStarted","Data":"ca27aaa5e011fbc7621e3164b0d3822145dbdcec24c92b3e146a7fb293d4a2aa"} Mar 07 21:41:21.469466 master-0 kubenswrapper[16352]: I0307 21:41:21.469380 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-s9668" podStartSLOduration=2.983514214 podStartE2EDuration="17.469356462s" podCreationTimestamp="2026-03-07 21:41:04 +0000 UTC" firstStartedPulling="2026-03-07 21:41:05.793405187 +0000 UTC m=+1388.864110246" lastFinishedPulling="2026-03-07 21:41:20.279247435 +0000 UTC m=+1403.349952494" observedRunningTime="2026-03-07 21:41:21.46638107 +0000 UTC m=+1404.537086139" watchObservedRunningTime="2026-03-07 21:41:21.469356462 +0000 UTC m=+1404.540061521" Mar 07 21:41:21.524887 master-0 kubenswrapper[16352]: I0307 21:41:21.524622 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=30.025323053 podStartE2EDuration="35.524590788s" podCreationTimestamp="2026-03-07 21:40:46 +0000 UTC" firstStartedPulling="2026-03-07 21:41:05.189021189 +0000 UTC m=+1388.259726268" lastFinishedPulling="2026-03-07 21:41:10.688288904 +0000 UTC m=+1393.758994003" observedRunningTime="2026-03-07 21:41:21.50885421 +0000 UTC m=+1404.579559289" watchObservedRunningTime="2026-03-07 
21:41:21.524590788 +0000 UTC m=+1404.595295867" Mar 07 21:41:21.890696 master-0 kubenswrapper[16352]: I0307 21:41:21.888138 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6465c5fc85-2kk4v"] Mar 07 21:41:21.890696 master-0 kubenswrapper[16352]: I0307 21:41:21.890173 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:21.896250 master-0 kubenswrapper[16352]: I0307 21:41:21.895955 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0" Mar 07 21:41:21.897962 master-0 kubenswrapper[16352]: I0307 21:41:21.897912 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-wptpb" Mar 07 21:41:21.920322 master-0 kubenswrapper[16352]: I0307 21:41:21.920250 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6465c5fc85-2kk4v"] Mar 07 21:41:21.952732 master-0 kubenswrapper[16352]: I0307 21:41:21.952624 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-sb\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:21.953013 master-0 kubenswrapper[16352]: I0307 21:41:21.952839 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-config\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:21.953013 master-0 kubenswrapper[16352]: I0307 21:41:21.952978 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" 
(UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-swift-storage-0\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:21.953086 master-0 kubenswrapper[16352]: I0307 21:41:21.953069 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-nb\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:21.953128 master-0 kubenswrapper[16352]: I0307 21:41:21.953096 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp9dz\" (UniqueName: \"kubernetes.io/projected/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-kube-api-access-kp9dz\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:21.953231 master-0 kubenswrapper[16352]: I0307 21:41:21.953199 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-svc\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.056126 master-0 kubenswrapper[16352]: I0307 21:41:22.055874 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-svc\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.056126 master-0 kubenswrapper[16352]: I0307 21:41:22.056026 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-sb\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.056472 master-0 kubenswrapper[16352]: I0307 21:41:22.056344 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-config\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.056792 master-0 kubenswrapper[16352]: I0307 21:41:22.056755 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-swift-storage-0\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.057031 master-0 kubenswrapper[16352]: I0307 21:41:22.056990 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-nb\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.057110 master-0 kubenswrapper[16352]: I0307 21:41:22.057052 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp9dz\" (UniqueName: \"kubernetes.io/projected/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-kube-api-access-kp9dz\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.057166 master-0 kubenswrapper[16352]: I0307 21:41:22.057118 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-svc\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.057444 master-0 kubenswrapper[16352]: I0307 21:41:22.057411 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-sb\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.057521 master-0 kubenswrapper[16352]: I0307 21:41:22.057429 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-config\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.058117 master-0 kubenswrapper[16352]: I0307 21:41:22.058083 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-nb\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.058605 master-0 kubenswrapper[16352]: I0307 21:41:22.058562 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-swift-storage-0\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.081658 master-0 kubenswrapper[16352]: I0307 21:41:22.081580 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-kp9dz\" (UniqueName: \"kubernetes.io/projected/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-kube-api-access-kp9dz\") pod \"dnsmasq-dns-6465c5fc85-2kk4v\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") " pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.222677 master-0 kubenswrapper[16352]: I0307 21:41:22.222442 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" Mar 07 21:41:22.739412 master-0 kubenswrapper[16352]: I0307 21:41:22.739290 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6465c5fc85-2kk4v"] Mar 07 21:41:22.756073 master-0 kubenswrapper[16352]: W0307 21:41:22.756019 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podaae009f4_e625_45c1_a7b5_f8f5ca02bb1e.slice/crio-d1d9a7d6c0af8c3298585374f2e00e16e40eb9f4a1714860551e003c14d8d18d WatchSource:0}: Error finding container d1d9a7d6c0af8c3298585374f2e00e16e40eb9f4a1714860551e003c14d8d18d: Status 404 returned error can't find the container with id d1d9a7d6c0af8c3298585374f2e00e16e40eb9f4a1714860551e003c14d8d18d Mar 07 21:41:22.961448 master-0 kubenswrapper[16352]: I0307 21:41:22.961384 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-wptpb-config-x2v29" Mar 07 21:41:23.095994 master-0 kubenswrapper[16352]: I0307 21:41:23.095924 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-scripts\") pod \"c3253e69-c364-492d-ba92-7847cd8db923\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " Mar 07 21:41:23.096113 master-0 kubenswrapper[16352]: I0307 21:41:23.096064 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run\") pod \"c3253e69-c364-492d-ba92-7847cd8db923\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " Mar 07 21:41:23.096270 master-0 kubenswrapper[16352]: I0307 21:41:23.096235 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run-ovn\") pod \"c3253e69-c364-492d-ba92-7847cd8db923\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " Mar 07 21:41:23.096406 master-0 kubenswrapper[16352]: I0307 21:41:23.096354 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "c3253e69-c364-492d-ba92-7847cd8db923" (UID: "c3253e69-c364-492d-ba92-7847cd8db923"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:41:23.096406 master-0 kubenswrapper[16352]: I0307 21:41:23.096397 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-additional-scripts\") pod \"c3253e69-c364-492d-ba92-7847cd8db923\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") " Mar 07 21:41:23.096495 master-0 kubenswrapper[16352]: I0307 21:41:23.096340 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run" (OuterVolumeSpecName: "var-run") pod "c3253e69-c364-492d-ba92-7847cd8db923" (UID: "c3253e69-c364-492d-ba92-7847cd8db923"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.097349 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "c3253e69-c364-492d-ba92-7847cd8db923" (UID: "c3253e69-c364-492d-ba92-7847cd8db923"). InnerVolumeSpecName "additional-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.097397 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-log-ovn\") pod \"c3253e69-c364-492d-ba92-7847cd8db923\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") "
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.097472 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "c3253e69-c364-492d-ba92-7847cd8db923" (UID: "c3253e69-c364-492d-ba92-7847cd8db923"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.097566 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nflwb\" (UniqueName: \"kubernetes.io/projected/c3253e69-c364-492d-ba92-7847cd8db923-kube-api-access-nflwb\") pod \"c3253e69-c364-492d-ba92-7847cd8db923\" (UID: \"c3253e69-c364-492d-ba92-7847cd8db923\") "
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.097562 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-scripts" (OuterVolumeSpecName: "scripts") pod "c3253e69-c364-492d-ba92-7847cd8db923" (UID: "c3253e69-c364-492d-ba92-7847cd8db923"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.098929 16352 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-additional-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.098954 16352 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-log-ovn\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.098970 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/c3253e69-c364-492d-ba92-7847cd8db923-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.098985 16352 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:23.104447 master-0 kubenswrapper[16352]: I0307 21:41:23.098998 16352 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/c3253e69-c364-492d-ba92-7847cd8db923-var-run-ovn\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:23.106555 master-0 kubenswrapper[16352]: I0307 21:41:23.106498 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3253e69-c364-492d-ba92-7847cd8db923-kube-api-access-nflwb" (OuterVolumeSpecName: "kube-api-access-nflwb") pod "c3253e69-c364-492d-ba92-7847cd8db923" (UID: "c3253e69-c364-492d-ba92-7847cd8db923"). InnerVolumeSpecName "kube-api-access-nflwb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:23.202659 master-0 kubenswrapper[16352]: I0307 21:41:23.202597 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nflwb\" (UniqueName: \"kubernetes.io/projected/c3253e69-c364-492d-ba92-7847cd8db923-kube-api-access-nflwb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:41:23.493091 master-0 kubenswrapper[16352]: I0307 21:41:23.493013 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-wptpb-config-x2v29"
Mar 07 21:41:23.493456 master-0 kubenswrapper[16352]: I0307 21:41:23.493004 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-wptpb-config-x2v29" event={"ID":"c3253e69-c364-492d-ba92-7847cd8db923","Type":"ContainerDied","Data":"48c53000e15989c224dee288f840e045cad1eb54710987d80a1d958b8c4454ed"}
Mar 07 21:41:23.493456 master-0 kubenswrapper[16352]: I0307 21:41:23.493224 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="48c53000e15989c224dee288f840e045cad1eb54710987d80a1d958b8c4454ed"
Mar 07 21:41:23.495703 master-0 kubenswrapper[16352]: I0307 21:41:23.495575 16352 generic.go:334] "Generic (PLEG): container finished" podID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerID="736e60a4114fa1b5c5e58807a6770c62c71f078d309417e7b6b756864891ba0e" exitCode=0
Mar 07 21:41:23.495880 master-0 kubenswrapper[16352]: I0307 21:41:23.495789 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" event={"ID":"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e","Type":"ContainerDied","Data":"736e60a4114fa1b5c5e58807a6770c62c71f078d309417e7b6b756864891ba0e"}
Mar 07 21:41:23.496024 master-0 kubenswrapper[16352]: I0307 21:41:23.495995 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" event={"ID":"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e","Type":"ContainerStarted","Data":"d1d9a7d6c0af8c3298585374f2e00e16e40eb9f4a1714860551e003c14d8d18d"}
Mar 07 21:41:24.111498 master-0 kubenswrapper[16352]: I0307 21:41:24.111380 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-wptpb-config-x2v29"]
Mar 07 21:41:24.122912 master-0 kubenswrapper[16352]: I0307 21:41:24.122830 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-wptpb-config-x2v29"]
Mar 07 21:41:24.509282 master-0 kubenswrapper[16352]: I0307 21:41:24.509120 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" event={"ID":"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e","Type":"ContainerStarted","Data":"ffaccf5640c9098eb475fc97a6d7aace0b5d4318b043d69351571c0b81b30a75"}
Mar 07 21:41:24.509534 master-0 kubenswrapper[16352]: I0307 21:41:24.509305 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v"
Mar 07 21:41:24.543337 master-0 kubenswrapper[16352]: I0307 21:41:24.543181 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" podStartSLOduration=3.543153865 podStartE2EDuration="3.543153865s" podCreationTimestamp="2026-03-07 21:41:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:24.53419589 +0000 UTC m=+1407.604900979" watchObservedRunningTime="2026-03-07 21:41:24.543153865 +0000 UTC m=+1407.613858924"
Mar 07 21:41:25.210212 master-0 kubenswrapper[16352]: I0307 21:41:25.210101 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3253e69-c364-492d-ba92-7847cd8db923" path="/var/lib/kubelet/pods/c3253e69-c364-492d-ba92-7847cd8db923/volumes"
Mar 07 21:41:25.586111 master-0 kubenswrapper[16352]: I0307 21:41:25.585850 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 07 21:41:27.014192 master-0 kubenswrapper[16352]: I0307 21:41:27.014143 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 07 21:41:27.833351 master-0 kubenswrapper[16352]: I0307 21:41:27.833276 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-5hn4x"]
Mar 07 21:41:27.833876 master-0 kubenswrapper[16352]: E0307 21:41:27.833830 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3253e69-c364-492d-ba92-7847cd8db923" containerName="ovn-config"
Mar 07 21:41:27.833876 master-0 kubenswrapper[16352]: I0307 21:41:27.833852 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3253e69-c364-492d-ba92-7847cd8db923" containerName="ovn-config"
Mar 07 21:41:27.834122 master-0 kubenswrapper[16352]: I0307 21:41:27.834095 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3253e69-c364-492d-ba92-7847cd8db923" containerName="ovn-config"
Mar 07 21:41:27.834892 master-0 kubenswrapper[16352]: I0307 21:41:27.834861 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:27.848799 master-0 kubenswrapper[16352]: I0307 21:41:27.848720 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5hn4x"]
Mar 07 21:41:27.967812 master-0 kubenswrapper[16352]: I0307 21:41:27.966954 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-66da-account-create-update-cczxb"]
Mar 07 21:41:27.970027 master-0 kubenswrapper[16352]: I0307 21:41:27.968526 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:27.976749 master-0 kubenswrapper[16352]: I0307 21:41:27.976613 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-66da-account-create-update-cczxb"]
Mar 07 21:41:28.020648 master-0 kubenswrapper[16352]: I0307 21:41:28.020583 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a8458eb-8fd1-4415-871a-4a9b45f21de9-operator-scripts\") pod \"cinder-66da-account-create-update-cczxb\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") " pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:28.021118 master-0 kubenswrapper[16352]: I0307 21:41:28.020804 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cqwkp\" (UniqueName: \"kubernetes.io/projected/e3a1e884-7f03-4892-ab50-fa35f704f9a1-kube-api-access-cqwkp\") pod \"cinder-db-create-5hn4x\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") " pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:28.021118 master-0 kubenswrapper[16352]: I0307 21:41:28.020850 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bmcxd\" (UniqueName: \"kubernetes.io/projected/1a8458eb-8fd1-4415-871a-4a9b45f21de9-kube-api-access-bmcxd\") pod \"cinder-66da-account-create-update-cczxb\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") " pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:28.021118 master-0 kubenswrapper[16352]: I0307 21:41:28.020917 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a1e884-7f03-4892-ab50-fa35f704f9a1-operator-scripts\") pod \"cinder-db-create-5hn4x\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") " pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:28.033354 master-0 kubenswrapper[16352]: I0307 21:41:28.033302 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 07 21:41:28.123842 master-0 kubenswrapper[16352]: I0307 21:41:28.123759 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqwkp\" (UniqueName: \"kubernetes.io/projected/e3a1e884-7f03-4892-ab50-fa35f704f9a1-kube-api-access-cqwkp\") pod \"cinder-db-create-5hn4x\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") " pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:28.123842 master-0 kubenswrapper[16352]: I0307 21:41:28.123843 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bmcxd\" (UniqueName: \"kubernetes.io/projected/1a8458eb-8fd1-4415-871a-4a9b45f21de9-kube-api-access-bmcxd\") pod \"cinder-66da-account-create-update-cczxb\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") " pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:28.124164 master-0 kubenswrapper[16352]: I0307 21:41:28.123896 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a1e884-7f03-4892-ab50-fa35f704f9a1-operator-scripts\") pod \"cinder-db-create-5hn4x\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") " pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:28.124164 master-0 kubenswrapper[16352]: I0307 21:41:28.123962 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a8458eb-8fd1-4415-871a-4a9b45f21de9-operator-scripts\") pod \"cinder-66da-account-create-update-cczxb\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") " pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:28.124815 master-0 kubenswrapper[16352]: I0307 21:41:28.124791 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a8458eb-8fd1-4415-871a-4a9b45f21de9-operator-scripts\") pod \"cinder-66da-account-create-update-cczxb\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") " pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:28.129610 master-0 kubenswrapper[16352]: I0307 21:41:28.129571 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a1e884-7f03-4892-ab50-fa35f704f9a1-operator-scripts\") pod \"cinder-db-create-5hn4x\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") " pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:28.165705 master-0 kubenswrapper[16352]: I0307 21:41:28.163417 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqwkp\" (UniqueName: \"kubernetes.io/projected/e3a1e884-7f03-4892-ab50-fa35f704f9a1-kube-api-access-cqwkp\") pod \"cinder-db-create-5hn4x\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") " pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:28.174704 master-0 kubenswrapper[16352]: I0307 21:41:28.171552 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bmcxd\" (UniqueName: \"kubernetes.io/projected/1a8458eb-8fd1-4415-871a-4a9b45f21de9-kube-api-access-bmcxd\") pod \"cinder-66da-account-create-update-cczxb\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") " pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:28.174704 master-0 kubenswrapper[16352]: I0307 21:41:28.174514 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:28.232784 master-0 kubenswrapper[16352]: I0307 21:41:28.230557 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-dkp4f"]
Mar 07 21:41:28.232784 master-0 kubenswrapper[16352]: I0307 21:41:28.232034 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.258125 master-0 kubenswrapper[16352]: I0307 21:41:28.256168 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dkp4f"]
Mar 07 21:41:28.333707 master-0 kubenswrapper[16352]: I0307 21:41:28.327769 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-cmr5t"]
Mar 07 21:41:28.333707 master-0 kubenswrapper[16352]: I0307 21:41:28.329738 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.334655 master-0 kubenswrapper[16352]: I0307 21:41:28.334594 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-combined-ca-bundle\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.334750 master-0 kubenswrapper[16352]: I0307 21:41:28.334723 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-config-data\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.334801 master-0 kubenswrapper[16352]: I0307 21:41:28.334753 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7pq5c\" (UniqueName: \"kubernetes.io/projected/68d35360-cb8c-439f-8efe-f84fa02416e8-kube-api-access-7pq5c\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.352315 master-0 kubenswrapper[16352]: I0307 21:41:28.352233 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9nq8w\" (UniqueName: \"kubernetes.io/projected/3fd3e133-30c5-4efb-9522-ee8423491e95-kube-api-access-9nq8w\") pod \"neutron-db-create-dkp4f\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") " pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.352578 master-0 kubenswrapper[16352]: I0307 21:41:28.352474 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd3e133-30c5-4efb-9522-ee8423491e95-operator-scripts\") pod \"neutron-db-create-dkp4f\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") " pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.356709 master-0 kubenswrapper[16352]: I0307 21:41:28.355902 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data"
Mar 07 21:41:28.356709 master-0 kubenswrapper[16352]: I0307 21:41:28.356531 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts"
Mar 07 21:41:28.356709 master-0 kubenswrapper[16352]: I0307 21:41:28.356661 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone"
Mar 07 21:41:28.357474 master-0 kubenswrapper[16352]: I0307 21:41:28.357400 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:28.415112 master-0 kubenswrapper[16352]: I0307 21:41:28.414952 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cmr5t"]
Mar 07 21:41:28.463376 master-0 kubenswrapper[16352]: I0307 21:41:28.459649 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-b4a7-account-create-update-xgrjd"]
Mar 07 21:41:28.463376 master-0 kubenswrapper[16352]: I0307 21:41:28.461199 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.474580 master-0 kubenswrapper[16352]: I0307 21:41:28.472387 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret"
Mar 07 21:41:28.474580 master-0 kubenswrapper[16352]: I0307 21:41:28.472736 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4a7-account-create-update-xgrjd"]
Mar 07 21:41:28.477867 master-0 kubenswrapper[16352]: I0307 21:41:28.477511 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-operator-scripts\") pod \"neutron-b4a7-account-create-update-xgrjd\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") " pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.477867 master-0 kubenswrapper[16352]: I0307 21:41:28.477739 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9nq8w\" (UniqueName: \"kubernetes.io/projected/3fd3e133-30c5-4efb-9522-ee8423491e95-kube-api-access-9nq8w\") pod \"neutron-db-create-dkp4f\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") " pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.477975 master-0 kubenswrapper[16352]: I0307 21:41:28.477962 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd3e133-30c5-4efb-9522-ee8423491e95-operator-scripts\") pod \"neutron-db-create-dkp4f\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") " pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.478079 master-0 kubenswrapper[16352]: I0307 21:41:28.478043 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-combined-ca-bundle\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.478120 master-0 kubenswrapper[16352]: I0307 21:41:28.478085 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csnfr\" (UniqueName: \"kubernetes.io/projected/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-kube-api-access-csnfr\") pod \"neutron-b4a7-account-create-update-xgrjd\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") " pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.478247 master-0 kubenswrapper[16352]: I0307 21:41:28.478217 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-config-data\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.478289 master-0 kubenswrapper[16352]: I0307 21:41:28.478253 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7pq5c\" (UniqueName: \"kubernetes.io/projected/68d35360-cb8c-439f-8efe-f84fa02416e8-kube-api-access-7pq5c\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.480640 master-0 kubenswrapper[16352]: I0307 21:41:28.480080 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd3e133-30c5-4efb-9522-ee8423491e95-operator-scripts\") pod \"neutron-db-create-dkp4f\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") " pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.487751 master-0 kubenswrapper[16352]: I0307 21:41:28.487703 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-config-data\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.492224 master-0 kubenswrapper[16352]: I0307 21:41:28.492187 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-combined-ca-bundle\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.509268 master-0 kubenswrapper[16352]: I0307 21:41:28.509192 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7pq5c\" (UniqueName: \"kubernetes.io/projected/68d35360-cb8c-439f-8efe-f84fa02416e8-kube-api-access-7pq5c\") pod \"keystone-db-sync-cmr5t\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.515731 master-0 kubenswrapper[16352]: I0307 21:41:28.515663 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9nq8w\" (UniqueName: \"kubernetes.io/projected/3fd3e133-30c5-4efb-9522-ee8423491e95-kube-api-access-9nq8w\") pod \"neutron-db-create-dkp4f\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") " pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.581042 master-0 kubenswrapper[16352]: I0307 21:41:28.580921 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csnfr\" (UniqueName: \"kubernetes.io/projected/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-kube-api-access-csnfr\") pod \"neutron-b4a7-account-create-update-xgrjd\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") " pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.581310 master-0 kubenswrapper[16352]: I0307 21:41:28.581156 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-operator-scripts\") pod \"neutron-b4a7-account-create-update-xgrjd\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") " pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.582998 master-0 kubenswrapper[16352]: I0307 21:41:28.582301 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-operator-scripts\") pod \"neutron-b4a7-account-create-update-xgrjd\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") " pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.612905 master-0 kubenswrapper[16352]: I0307 21:41:28.612097 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csnfr\" (UniqueName: \"kubernetes.io/projected/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-kube-api-access-csnfr\") pod \"neutron-b4a7-account-create-update-xgrjd\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") " pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.778550 master-0 kubenswrapper[16352]: I0307 21:41:28.774908 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:28.802896 master-0 kubenswrapper[16352]: I0307 21:41:28.802824 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cmr5t"
Mar 07 21:41:28.826728 master-0 kubenswrapper[16352]: I0307 21:41:28.826666 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:28.900174 master-0 kubenswrapper[16352]: I0307 21:41:28.900088 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-5hn4x"]
Mar 07 21:41:29.355130 master-0 kubenswrapper[16352]: I0307 21:41:29.354978 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-66da-account-create-update-cczxb"]
Mar 07 21:41:29.418958 master-0 kubenswrapper[16352]: I0307 21:41:29.418873 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-dkp4f"]
Mar 07 21:41:29.419945 master-0 kubenswrapper[16352]: W0307 21:41:29.419539 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3fd3e133_30c5_4efb_9522_ee8423491e95.slice/crio-15df50414c2b54d80ebb5920ce8f4d788f0a46315d90fe2c35c755e7e609d92c WatchSource:0}: Error finding container 15df50414c2b54d80ebb5920ce8f4d788f0a46315d90fe2c35c755e7e609d92c: Status 404 returned error can't find the container with id 15df50414c2b54d80ebb5920ce8f4d788f0a46315d90fe2c35c755e7e609d92c
Mar 07 21:41:29.531543 master-0 kubenswrapper[16352]: I0307 21:41:29.531011 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-cmr5t"]
Mar 07 21:41:29.544014 master-0 kubenswrapper[16352]: I0307 21:41:29.542895 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-b4a7-account-create-update-xgrjd"]
Mar 07 21:41:29.579967 master-0 kubenswrapper[16352]: I0307 21:41:29.579301 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cmr5t" event={"ID":"68d35360-cb8c-439f-8efe-f84fa02416e8","Type":"ContainerStarted","Data":"3922a0eec13a8e70abbf1749ce4253c1444ebb8788b7f51185290dfe684dec6d"}
Mar 07 21:41:29.585262 master-0 kubenswrapper[16352]: I0307 21:41:29.585149 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dkp4f" event={"ID":"3fd3e133-30c5-4efb-9522-ee8423491e95","Type":"ContainerStarted","Data":"15df50414c2b54d80ebb5920ce8f4d788f0a46315d90fe2c35c755e7e609d92c"}
Mar 07 21:41:29.590525 master-0 kubenswrapper[16352]: I0307 21:41:29.589983 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-66da-account-create-update-cczxb" event={"ID":"1a8458eb-8fd1-4415-871a-4a9b45f21de9","Type":"ContainerStarted","Data":"6b9fcba9a1ee229c1ea3dfa892aa6a1a49e92f86bd81da32f5155bb1dba178fb"}
Mar 07 21:41:29.592153 master-0 kubenswrapper[16352]: I0307 21:41:29.592094 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5hn4x" event={"ID":"e3a1e884-7f03-4892-ab50-fa35f704f9a1","Type":"ContainerStarted","Data":"3ff1bfa4c48957b3ebdbb387bb896638f5e87c9f70c2d14b5c74643ae2764dc6"}
Mar 07 21:41:29.592153 master-0 kubenswrapper[16352]: I0307 21:41:29.592148 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5hn4x" event={"ID":"e3a1e884-7f03-4892-ab50-fa35f704f9a1","Type":"ContainerStarted","Data":"81414bf52764b27874ae85c9956d30e3c25983036335f5839989d1b582b09c3d"}
Mar 07 21:41:29.594531 master-0 kubenswrapper[16352]: I0307 21:41:29.594319 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4a7-account-create-update-xgrjd" event={"ID":"7aed4fb8-bf15-4337-8567-30ad6ff11ce3","Type":"ContainerStarted","Data":"0ab570c0e0ead182883e3921593989b58777781de0c5a9a9fdabe78f387a8734"}
Mar 07 21:41:29.624834 master-0 kubenswrapper[16352]: I0307 21:41:29.624612 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-create-5hn4x" podStartSLOduration=2.624584369 podStartE2EDuration="2.624584369s" podCreationTimestamp="2026-03-07 21:41:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:29.621970826 +0000 UTC m=+1412.692675885" watchObservedRunningTime="2026-03-07 21:41:29.624584369 +0000 UTC m=+1412.695289428"
Mar 07 21:41:30.614420 master-0 kubenswrapper[16352]: I0307 21:41:30.614327 16352 generic.go:334] "Generic (PLEG): container finished" podID="1a8458eb-8fd1-4415-871a-4a9b45f21de9" containerID="0909e7a52b3666dad29a2718ce2344084b76934cbf54715aff39eb6e54b4bde1" exitCode=0
Mar 07 21:41:30.615073 master-0 kubenswrapper[16352]: I0307 21:41:30.614439 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-66da-account-create-update-cczxb" event={"ID":"1a8458eb-8fd1-4415-871a-4a9b45f21de9","Type":"ContainerDied","Data":"0909e7a52b3666dad29a2718ce2344084b76934cbf54715aff39eb6e54b4bde1"}
Mar 07 21:41:30.619469 master-0 kubenswrapper[16352]: I0307 21:41:30.619410 16352 generic.go:334] "Generic (PLEG): container finished" podID="e3a1e884-7f03-4892-ab50-fa35f704f9a1" containerID="3ff1bfa4c48957b3ebdbb387bb896638f5e87c9f70c2d14b5c74643ae2764dc6" exitCode=0
Mar 07 21:41:30.619571 master-0 kubenswrapper[16352]: I0307 21:41:30.619519 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5hn4x" event={"ID":"e3a1e884-7f03-4892-ab50-fa35f704f9a1","Type":"ContainerDied","Data":"3ff1bfa4c48957b3ebdbb387bb896638f5e87c9f70c2d14b5c74643ae2764dc6"}
Mar 07 21:41:30.621784 master-0 kubenswrapper[16352]: I0307 21:41:30.621739 16352 generic.go:334] "Generic (PLEG): container finished" podID="7aed4fb8-bf15-4337-8567-30ad6ff11ce3" containerID="ddfa45451932b9f4486cf12c7cb8a7826c3eba8173a69005fcee8d6abe5fcaf8" exitCode=0
Mar 07 21:41:30.621852 master-0 kubenswrapper[16352]: I0307 21:41:30.621804 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4a7-account-create-update-xgrjd" event={"ID":"7aed4fb8-bf15-4337-8567-30ad6ff11ce3","Type":"ContainerDied","Data":"ddfa45451932b9f4486cf12c7cb8a7826c3eba8173a69005fcee8d6abe5fcaf8"}
Mar 07 21:41:30.623980 master-0 kubenswrapper[16352]: I0307 21:41:30.623938 16352 generic.go:334] "Generic (PLEG): container finished" podID="3fd3e133-30c5-4efb-9522-ee8423491e95" containerID="1492792eff1f8c3692e3c3184a969fc94ca5abbc701364a6b2c15146a925042a" exitCode=0
Mar 07 21:41:30.624040 master-0 kubenswrapper[16352]: I0307 21:41:30.623985 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dkp4f" event={"ID":"3fd3e133-30c5-4efb-9522-ee8423491e95","Type":"ContainerDied","Data":"1492792eff1f8c3692e3c3184a969fc94ca5abbc701364a6b2c15146a925042a"}
Mar 07 21:41:31.651708 master-0 kubenswrapper[16352]: I0307 21:41:31.651609 16352 generic.go:334] "Generic (PLEG): container finished" podID="c53b3432-7649-440a-b109-bb48be9f10c7" containerID="26c2967c8a278140c4cf0ec5f5f1a8d674f8cc49abe5d0529afedb2063bf41c2" exitCode=0
Mar 07 21:41:31.652576 master-0 kubenswrapper[16352]: I0307 21:41:31.651767 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s9668" event={"ID":"c53b3432-7649-440a-b109-bb48be9f10c7","Type":"ContainerDied","Data":"26c2967c8a278140c4cf0ec5f5f1a8d674f8cc49abe5d0529afedb2063bf41c2"}
Mar 07 21:41:32.225237 master-0 kubenswrapper[16352]: I0307 21:41:32.225127 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v"
Mar 07 21:41:32.382081 master-0 kubenswrapper[16352]: I0307 21:41:32.380634 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-7fbfp"]
Mar 07 21:41:32.382081 master-0 kubenswrapper[16352]: I0307 21:41:32.381061 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" podUID="f910f111-201a-459a-a4e6-3dd57dc69897" containerName="dnsmasq-dns" containerID="cri-o://c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5" gracePeriod=10
Mar 07 21:41:34.708176 master-0 kubenswrapper[16352]: I0307 21:41:34.706522 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-5hn4x" event={"ID":"e3a1e884-7f03-4892-ab50-fa35f704f9a1","Type":"ContainerDied","Data":"81414bf52764b27874ae85c9956d30e3c25983036335f5839989d1b582b09c3d"}
Mar 07 21:41:34.708176 master-0 kubenswrapper[16352]: I0307 21:41:34.706867 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81414bf52764b27874ae85c9956d30e3c25983036335f5839989d1b582b09c3d"
Mar 07 21:41:34.908539 master-0 kubenswrapper[16352]: I0307 21:41:34.904977 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5hn4x"
Mar 07 21:41:34.949925 master-0 kubenswrapper[16352]: I0307 21:41:34.942856 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4a7-account-create-update-xgrjd"
Mar 07 21:41:34.970734 master-0 kubenswrapper[16352]: I0307 21:41:34.956316 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-66da-account-create-update-cczxb"
Mar 07 21:41:34.970734 master-0 kubenswrapper[16352]: I0307 21:41:34.965582 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dkp4f"
Mar 07 21:41:35.013732 master-0 kubenswrapper[16352]: I0307 21:41:35.013667 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-operator-scripts\") pod \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") "
Mar 07 21:41:35.013947 master-0 kubenswrapper[16352]: I0307 21:41:35.013766 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a8458eb-8fd1-4415-871a-4a9b45f21de9-operator-scripts\") pod \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") "
Mar 07 21:41:35.013947 master-0 kubenswrapper[16352]: I0307 21:41:35.013801 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a1e884-7f03-4892-ab50-fa35f704f9a1-operator-scripts\") pod \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") "
Mar 07 21:41:35.013947 master-0 kubenswrapper[16352]: I0307 21:41:35.013916 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bmcxd\" (UniqueName: \"kubernetes.io/projected/1a8458eb-8fd1-4415-871a-4a9b45f21de9-kube-api-access-bmcxd\") pod \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\" (UID: \"1a8458eb-8fd1-4415-871a-4a9b45f21de9\") "
Mar 07 21:41:35.014058 master-0 kubenswrapper[16352]: I0307 21:41:35.014034 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd3e133-30c5-4efb-9522-ee8423491e95-operator-scripts\") pod \"3fd3e133-30c5-4efb-9522-ee8423491e95\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") "
Mar 07 21:41:35.014100 master-0 kubenswrapper[16352]: I0307 21:41:35.014088 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9nq8w\" (UniqueName: \"kubernetes.io/projected/3fd3e133-30c5-4efb-9522-ee8423491e95-kube-api-access-9nq8w\") pod \"3fd3e133-30c5-4efb-9522-ee8423491e95\" (UID: \"3fd3e133-30c5-4efb-9522-ee8423491e95\") "
Mar 07 21:41:35.014143 master-0 kubenswrapper[16352]: I0307 21:41:35.014111 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csnfr\" (UniqueName: \"kubernetes.io/projected/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-kube-api-access-csnfr\") pod \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\" (UID: \"7aed4fb8-bf15-4337-8567-30ad6ff11ce3\") "
Mar 07 21:41:35.014187 master-0 kubenswrapper[16352]: I0307 21:41:35.014179 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cqwkp\" (UniqueName: \"kubernetes.io/projected/e3a1e884-7f03-4892-ab50-fa35f704f9a1-kube-api-access-cqwkp\") pod \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\" (UID: \"e3a1e884-7f03-4892-ab50-fa35f704f9a1\") "
Mar 07 21:41:35.021399 master-0 kubenswrapper[16352]: I0307 21:41:35.020686 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e3a1e884-7f03-4892-ab50-fa35f704f9a1-kube-api-access-cqwkp" (OuterVolumeSpecName: "kube-api-access-cqwkp") pod "e3a1e884-7f03-4892-ab50-fa35f704f9a1" (UID: "e3a1e884-7f03-4892-ab50-fa35f704f9a1"). InnerVolumeSpecName "kube-api-access-cqwkp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:41:35.021711 master-0 kubenswrapper[16352]: I0307 21:41:35.021641 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a8458eb-8fd1-4415-871a-4a9b45f21de9-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1a8458eb-8fd1-4415-871a-4a9b45f21de9" (UID: "1a8458eb-8fd1-4415-871a-4a9b45f21de9"). InnerVolumeSpecName "operator-scripts".
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.022088 master-0 kubenswrapper[16352]: I0307 21:41:35.022059 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e3a1e884-7f03-4892-ab50-fa35f704f9a1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "e3a1e884-7f03-4892-ab50-fa35f704f9a1" (UID: "e3a1e884-7f03-4892-ab50-fa35f704f9a1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.022599 master-0 kubenswrapper[16352]: I0307 21:41:35.022520 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3fd3e133-30c5-4efb-9522-ee8423491e95-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3fd3e133-30c5-4efb-9522-ee8423491e95" (UID: "3fd3e133-30c5-4efb-9522-ee8423491e95"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.023528 master-0 kubenswrapper[16352]: I0307 21:41:35.023503 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7aed4fb8-bf15-4337-8567-30ad6ff11ce3" (UID: "7aed4fb8-bf15-4337-8567-30ad6ff11ce3"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.024949 master-0 kubenswrapper[16352]: I0307 21:41:35.024906 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-s9668" Mar 07 21:41:35.026113 master-0 kubenswrapper[16352]: I0307 21:41:35.026076 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a8458eb-8fd1-4415-871a-4a9b45f21de9-kube-api-access-bmcxd" (OuterVolumeSpecName: "kube-api-access-bmcxd") pod "1a8458eb-8fd1-4415-871a-4a9b45f21de9" (UID: "1a8458eb-8fd1-4415-871a-4a9b45f21de9"). InnerVolumeSpecName "kube-api-access-bmcxd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:35.030174 master-0 kubenswrapper[16352]: I0307 21:41:35.030078 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3fd3e133-30c5-4efb-9522-ee8423491e95-kube-api-access-9nq8w" (OuterVolumeSpecName: "kube-api-access-9nq8w") pod "3fd3e133-30c5-4efb-9522-ee8423491e95" (UID: "3fd3e133-30c5-4efb-9522-ee8423491e95"). InnerVolumeSpecName "kube-api-access-9nq8w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:35.030523 master-0 kubenswrapper[16352]: I0307 21:41:35.030477 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-kube-api-access-csnfr" (OuterVolumeSpecName: "kube-api-access-csnfr") pod "7aed4fb8-bf15-4337-8567-30ad6ff11ce3" (UID: "7aed4fb8-bf15-4337-8567-30ad6ff11ce3"). InnerVolumeSpecName "kube-api-access-csnfr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:35.117344 master-0 kubenswrapper[16352]: I0307 21:41:35.117276 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-config-data\") pod \"c53b3432-7649-440a-b109-bb48be9f10c7\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " Mar 07 21:41:35.117568 master-0 kubenswrapper[16352]: I0307 21:41:35.117475 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcq7j\" (UniqueName: \"kubernetes.io/projected/c53b3432-7649-440a-b109-bb48be9f10c7-kube-api-access-rcq7j\") pod \"c53b3432-7649-440a-b109-bb48be9f10c7\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " Mar 07 21:41:35.117629 master-0 kubenswrapper[16352]: I0307 21:41:35.117610 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-combined-ca-bundle\") pod \"c53b3432-7649-440a-b109-bb48be9f10c7\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " Mar 07 21:41:35.117727 master-0 kubenswrapper[16352]: I0307 21:41:35.117665 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-db-sync-config-data\") pod \"c53b3432-7649-440a-b109-bb48be9f10c7\" (UID: \"c53b3432-7649-440a-b109-bb48be9f10c7\") " Mar 07 21:41:35.118609 master-0 kubenswrapper[16352]: I0307 21:41:35.118575 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bmcxd\" (UniqueName: \"kubernetes.io/projected/1a8458eb-8fd1-4415-871a-4a9b45f21de9-kube-api-access-bmcxd\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.118609 master-0 kubenswrapper[16352]: I0307 21:41:35.118607 16352 reconciler_common.go:293] "Volume detached for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3fd3e133-30c5-4efb-9522-ee8423491e95-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.118779 master-0 kubenswrapper[16352]: I0307 21:41:35.118626 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9nq8w\" (UniqueName: \"kubernetes.io/projected/3fd3e133-30c5-4efb-9522-ee8423491e95-kube-api-access-9nq8w\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.118779 master-0 kubenswrapper[16352]: I0307 21:41:35.118642 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csnfr\" (UniqueName: \"kubernetes.io/projected/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-kube-api-access-csnfr\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.118779 master-0 kubenswrapper[16352]: I0307 21:41:35.118658 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cqwkp\" (UniqueName: \"kubernetes.io/projected/e3a1e884-7f03-4892-ab50-fa35f704f9a1-kube-api-access-cqwkp\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.118779 master-0 kubenswrapper[16352]: I0307 21:41:35.118673 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7aed4fb8-bf15-4337-8567-30ad6ff11ce3-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.118779 master-0 kubenswrapper[16352]: I0307 21:41:35.118691 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1a8458eb-8fd1-4415-871a-4a9b45f21de9-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.118779 master-0 kubenswrapper[16352]: I0307 21:41:35.118721 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/e3a1e884-7f03-4892-ab50-fa35f704f9a1-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.122548 master-0 kubenswrapper[16352]: I0307 
21:41:35.122501 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c53b3432-7649-440a-b109-bb48be9f10c7-kube-api-access-rcq7j" (OuterVolumeSpecName: "kube-api-access-rcq7j") pod "c53b3432-7649-440a-b109-bb48be9f10c7" (UID: "c53b3432-7649-440a-b109-bb48be9f10c7"). InnerVolumeSpecName "kube-api-access-rcq7j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:35.123160 master-0 kubenswrapper[16352]: I0307 21:41:35.123137 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c53b3432-7649-440a-b109-bb48be9f10c7" (UID: "c53b3432-7649-440a-b109-bb48be9f10c7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:35.128077 master-0 kubenswrapper[16352]: I0307 21:41:35.128054 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" Mar 07 21:41:35.148013 master-0 kubenswrapper[16352]: I0307 21:41:35.147946 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c53b3432-7649-440a-b109-bb48be9f10c7" (UID: "c53b3432-7649-440a-b109-bb48be9f10c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:35.172807 master-0 kubenswrapper[16352]: I0307 21:41:35.172761 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-config-data" (OuterVolumeSpecName: "config-data") pod "c53b3432-7649-440a-b109-bb48be9f10c7" (UID: "c53b3432-7649-440a-b109-bb48be9f10c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:35.220396 master-0 kubenswrapper[16352]: I0307 21:41:35.220075 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-nb\") pod \"f910f111-201a-459a-a4e6-3dd57dc69897\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " Mar 07 21:41:35.220396 master-0 kubenswrapper[16352]: I0307 21:41:35.220179 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-sb\") pod \"f910f111-201a-459a-a4e6-3dd57dc69897\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " Mar 07 21:41:35.220396 master-0 kubenswrapper[16352]: I0307 21:41:35.220214 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-config\") pod \"f910f111-201a-459a-a4e6-3dd57dc69897\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " Mar 07 21:41:35.220396 master-0 kubenswrapper[16352]: I0307 21:41:35.220270 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-dns-svc\") pod \"f910f111-201a-459a-a4e6-3dd57dc69897\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " Mar 07 21:41:35.220396 master-0 kubenswrapper[16352]: I0307 21:41:35.220322 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kmkw6\" (UniqueName: \"kubernetes.io/projected/f910f111-201a-459a-a4e6-3dd57dc69897-kube-api-access-kmkw6\") pod \"f910f111-201a-459a-a4e6-3dd57dc69897\" (UID: \"f910f111-201a-459a-a4e6-3dd57dc69897\") " Mar 07 21:41:35.221130 master-0 kubenswrapper[16352]: I0307 21:41:35.220711 16352 reconciler_common.go:293] "Volume 
detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.221130 master-0 kubenswrapper[16352]: I0307 21:41:35.220738 16352 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.221130 master-0 kubenswrapper[16352]: I0307 21:41:35.220751 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c53b3432-7649-440a-b109-bb48be9f10c7-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.221130 master-0 kubenswrapper[16352]: I0307 21:41:35.220762 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcq7j\" (UniqueName: \"kubernetes.io/projected/c53b3432-7649-440a-b109-bb48be9f10c7-kube-api-access-rcq7j\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.228831 master-0 kubenswrapper[16352]: I0307 21:41:35.228740 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f910f111-201a-459a-a4e6-3dd57dc69897-kube-api-access-kmkw6" (OuterVolumeSpecName: "kube-api-access-kmkw6") pod "f910f111-201a-459a-a4e6-3dd57dc69897" (UID: "f910f111-201a-459a-a4e6-3dd57dc69897"). InnerVolumeSpecName "kube-api-access-kmkw6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:35.323390 master-0 kubenswrapper[16352]: I0307 21:41:35.323323 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kmkw6\" (UniqueName: \"kubernetes.io/projected/f910f111-201a-459a-a4e6-3dd57dc69897-kube-api-access-kmkw6\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.335881 master-0 kubenswrapper[16352]: I0307 21:41:35.335782 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f910f111-201a-459a-a4e6-3dd57dc69897" (UID: "f910f111-201a-459a-a4e6-3dd57dc69897"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.340979 master-0 kubenswrapper[16352]: I0307 21:41:35.337829 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-config" (OuterVolumeSpecName: "config") pod "f910f111-201a-459a-a4e6-3dd57dc69897" (UID: "f910f111-201a-459a-a4e6-3dd57dc69897"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.340979 master-0 kubenswrapper[16352]: I0307 21:41:35.340758 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f910f111-201a-459a-a4e6-3dd57dc69897" (UID: "f910f111-201a-459a-a4e6-3dd57dc69897"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.369896 master-0 kubenswrapper[16352]: I0307 21:41:35.369785 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f910f111-201a-459a-a4e6-3dd57dc69897" (UID: "f910f111-201a-459a-a4e6-3dd57dc69897"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:35.427281 master-0 kubenswrapper[16352]: I0307 21:41:35.427159 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.427281 master-0 kubenswrapper[16352]: I0307 21:41:35.427271 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.427281 master-0 kubenswrapper[16352]: I0307 21:41:35.427294 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.427281 master-0 kubenswrapper[16352]: I0307 21:41:35.427310 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f910f111-201a-459a-a4e6-3dd57dc69897-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:35.724900 master-0 kubenswrapper[16352]: I0307 21:41:35.724835 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cmr5t" event={"ID":"68d35360-cb8c-439f-8efe-f84fa02416e8","Type":"ContainerStarted","Data":"ce0343eb6e31502152abf1d3207c481b00edb500c99ae5b13af3b1b83142fd0f"} Mar 07 21:41:35.731192 master-0 
kubenswrapper[16352]: I0307 21:41:35.731144 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-dkp4f" event={"ID":"3fd3e133-30c5-4efb-9522-ee8423491e95","Type":"ContainerDied","Data":"15df50414c2b54d80ebb5920ce8f4d788f0a46315d90fe2c35c755e7e609d92c"} Mar 07 21:41:35.731192 master-0 kubenswrapper[16352]: I0307 21:41:35.731190 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="15df50414c2b54d80ebb5920ce8f4d788f0a46315d90fe2c35c755e7e609d92c" Mar 07 21:41:35.731391 master-0 kubenswrapper[16352]: I0307 21:41:35.731295 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-dkp4f" Mar 07 21:41:35.736008 master-0 kubenswrapper[16352]: I0307 21:41:35.735977 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-66da-account-create-update-cczxb" Mar 07 21:41:35.736360 master-0 kubenswrapper[16352]: I0307 21:41:35.736327 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-66da-account-create-update-cczxb" event={"ID":"1a8458eb-8fd1-4415-871a-4a9b45f21de9","Type":"ContainerDied","Data":"6b9fcba9a1ee229c1ea3dfa892aa6a1a49e92f86bd81da32f5155bb1dba178fb"} Mar 07 21:41:35.736360 master-0 kubenswrapper[16352]: I0307 21:41:35.736354 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6b9fcba9a1ee229c1ea3dfa892aa6a1a49e92f86bd81da32f5155bb1dba178fb" Mar 07 21:41:35.741168 master-0 kubenswrapper[16352]: I0307 21:41:35.739347 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-b4a7-account-create-update-xgrjd" event={"ID":"7aed4fb8-bf15-4337-8567-30ad6ff11ce3","Type":"ContainerDied","Data":"0ab570c0e0ead182883e3921593989b58777781de0c5a9a9fdabe78f387a8734"} Mar 07 21:41:35.741243 master-0 kubenswrapper[16352]: I0307 21:41:35.741169 16352 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="0ab570c0e0ead182883e3921593989b58777781de0c5a9a9fdabe78f387a8734" Mar 07 21:41:35.741299 master-0 kubenswrapper[16352]: I0307 21:41:35.741246 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-b4a7-account-create-update-xgrjd" Mar 07 21:41:35.745048 master-0 kubenswrapper[16352]: I0307 21:41:35.745008 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-s9668" Mar 07 21:41:35.745974 master-0 kubenswrapper[16352]: I0307 21:41:35.745942 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-s9668" event={"ID":"c53b3432-7649-440a-b109-bb48be9f10c7","Type":"ContainerDied","Data":"722c174453d42d2d76049c92375dd65316a0792ae745900c1ff848f26156bc0b"} Mar 07 21:41:35.746060 master-0 kubenswrapper[16352]: I0307 21:41:35.745981 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="722c174453d42d2d76049c92375dd65316a0792ae745900c1ff848f26156bc0b" Mar 07 21:41:35.763735 master-0 kubenswrapper[16352]: I0307 21:41:35.754487 16352 generic.go:334] "Generic (PLEG): container finished" podID="f910f111-201a-459a-a4e6-3dd57dc69897" containerID="c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5" exitCode=0 Mar 07 21:41:35.763735 master-0 kubenswrapper[16352]: I0307 21:41:35.754681 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-5hn4x" Mar 07 21:41:35.763735 master-0 kubenswrapper[16352]: I0307 21:41:35.754805 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" Mar 07 21:41:35.763735 master-0 kubenswrapper[16352]: I0307 21:41:35.754865 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" event={"ID":"f910f111-201a-459a-a4e6-3dd57dc69897","Type":"ContainerDied","Data":"c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5"} Mar 07 21:41:35.763735 master-0 kubenswrapper[16352]: I0307 21:41:35.754933 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d6c6c44c5-7fbfp" event={"ID":"f910f111-201a-459a-a4e6-3dd57dc69897","Type":"ContainerDied","Data":"45c0ccfd744c65b0f0b187f38a2782650198e4e99723e17a8a57dc40a48429e4"} Mar 07 21:41:35.763735 master-0 kubenswrapper[16352]: I0307 21:41:35.754964 16352 scope.go:117] "RemoveContainer" containerID="c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5" Mar 07 21:41:35.763735 master-0 kubenswrapper[16352]: I0307 21:41:35.758233 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-cmr5t" podStartSLOduration=2.5454993249999998 podStartE2EDuration="7.758217301s" podCreationTimestamp="2026-03-07 21:41:28 +0000 UTC" firstStartedPulling="2026-03-07 21:41:29.53920797 +0000 UTC m=+1412.609913029" lastFinishedPulling="2026-03-07 21:41:34.751925926 +0000 UTC m=+1417.822631005" observedRunningTime="2026-03-07 21:41:35.752908013 +0000 UTC m=+1418.823613102" watchObservedRunningTime="2026-03-07 21:41:35.758217301 +0000 UTC m=+1418.828922360" Mar 07 21:41:35.809404 master-0 kubenswrapper[16352]: I0307 21:41:35.809346 16352 scope.go:117] "RemoveContainer" containerID="0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818" Mar 07 21:41:35.819923 master-0 kubenswrapper[16352]: I0307 21:41:35.819863 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-7fbfp"] Mar 07 21:41:35.841054 master-0 kubenswrapper[16352]: I0307 21:41:35.839750 
16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d6c6c44c5-7fbfp"] Mar 07 21:41:35.851550 master-0 kubenswrapper[16352]: I0307 21:41:35.847953 16352 scope.go:117] "RemoveContainer" containerID="c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5" Mar 07 21:41:35.851550 master-0 kubenswrapper[16352]: E0307 21:41:35.848557 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5\": container with ID starting with c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5 not found: ID does not exist" containerID="c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5" Mar 07 21:41:35.851550 master-0 kubenswrapper[16352]: I0307 21:41:35.848609 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5"} err="failed to get container status \"c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5\": rpc error: code = NotFound desc = could not find container \"c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5\": container with ID starting with c26d23cc060c1a97440960f9656b053d3639754456aaaf4f107f90c97e89bcd5 not found: ID does not exist" Mar 07 21:41:35.851550 master-0 kubenswrapper[16352]: I0307 21:41:35.848636 16352 scope.go:117] "RemoveContainer" containerID="0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818" Mar 07 21:41:35.851550 master-0 kubenswrapper[16352]: E0307 21:41:35.849499 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818\": container with ID starting with 0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818 not found: ID does not exist" 
containerID="0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818" Mar 07 21:41:35.851550 master-0 kubenswrapper[16352]: I0307 21:41:35.849527 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818"} err="failed to get container status \"0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818\": rpc error: code = NotFound desc = could not find container \"0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818\": container with ID starting with 0dc5f6791865e45a0f24d159607363f7a53b86b181bd61706dee466d3a90a818 not found: ID does not exist" Mar 07 21:41:36.640513 master-0 kubenswrapper[16352]: I0307 21:41:36.640415 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f5db5bd5-2tvbr"] Mar 07 21:41:36.648509 master-0 kubenswrapper[16352]: E0307 21:41:36.648414 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e3a1e884-7f03-4892-ab50-fa35f704f9a1" containerName="mariadb-database-create" Mar 07 21:41:36.648509 master-0 kubenswrapper[16352]: I0307 21:41:36.648521 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e3a1e884-7f03-4892-ab50-fa35f704f9a1" containerName="mariadb-database-create" Mar 07 21:41:36.648817 master-0 kubenswrapper[16352]: E0307 21:41:36.648575 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c53b3432-7649-440a-b109-bb48be9f10c7" containerName="glance-db-sync" Mar 07 21:41:36.648817 master-0 kubenswrapper[16352]: I0307 21:41:36.648584 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c53b3432-7649-440a-b109-bb48be9f10c7" containerName="glance-db-sync" Mar 07 21:41:36.648817 master-0 kubenswrapper[16352]: E0307 21:41:36.648624 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f910f111-201a-459a-a4e6-3dd57dc69897" containerName="dnsmasq-dns" Mar 07 21:41:36.648817 master-0 kubenswrapper[16352]: I0307 
21:41:36.648634 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f910f111-201a-459a-a4e6-3dd57dc69897" containerName="dnsmasq-dns" Mar 07 21:41:36.648817 master-0 kubenswrapper[16352]: E0307 21:41:36.648670 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f910f111-201a-459a-a4e6-3dd57dc69897" containerName="init" Mar 07 21:41:36.648817 master-0 kubenswrapper[16352]: I0307 21:41:36.648681 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f910f111-201a-459a-a4e6-3dd57dc69897" containerName="init" Mar 07 21:41:36.649037 master-0 kubenswrapper[16352]: E0307 21:41:36.648879 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7aed4fb8-bf15-4337-8567-30ad6ff11ce3" containerName="mariadb-account-create-update" Mar 07 21:41:36.649037 master-0 kubenswrapper[16352]: I0307 21:41:36.648890 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="7aed4fb8-bf15-4337-8567-30ad6ff11ce3" containerName="mariadb-account-create-update" Mar 07 21:41:36.649037 master-0 kubenswrapper[16352]: E0307 21:41:36.648907 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1a8458eb-8fd1-4415-871a-4a9b45f21de9" containerName="mariadb-account-create-update" Mar 07 21:41:36.649037 master-0 kubenswrapper[16352]: I0307 21:41:36.648913 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="1a8458eb-8fd1-4415-871a-4a9b45f21de9" containerName="mariadb-account-create-update" Mar 07 21:41:36.649037 master-0 kubenswrapper[16352]: E0307 21:41:36.649022 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3fd3e133-30c5-4efb-9522-ee8423491e95" containerName="mariadb-database-create" Mar 07 21:41:36.649193 master-0 kubenswrapper[16352]: I0307 21:41:36.649051 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="3fd3e133-30c5-4efb-9522-ee8423491e95" containerName="mariadb-database-create" Mar 07 21:41:36.651460 master-0 kubenswrapper[16352]: I0307 21:41:36.651092 16352 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="3fd3e133-30c5-4efb-9522-ee8423491e95" containerName="mariadb-database-create" Mar 07 21:41:36.651460 master-0 kubenswrapper[16352]: I0307 21:41:36.651150 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f910f111-201a-459a-a4e6-3dd57dc69897" containerName="dnsmasq-dns" Mar 07 21:41:36.651460 master-0 kubenswrapper[16352]: I0307 21:41:36.651317 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="e3a1e884-7f03-4892-ab50-fa35f704f9a1" containerName="mariadb-database-create" Mar 07 21:41:36.651460 master-0 kubenswrapper[16352]: I0307 21:41:36.651343 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="7aed4fb8-bf15-4337-8567-30ad6ff11ce3" containerName="mariadb-account-create-update" Mar 07 21:41:36.651460 master-0 kubenswrapper[16352]: I0307 21:41:36.651359 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="1a8458eb-8fd1-4415-871a-4a9b45f21de9" containerName="mariadb-account-create-update" Mar 07 21:41:36.651460 master-0 kubenswrapper[16352]: I0307 21:41:36.651373 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c53b3432-7649-440a-b109-bb48be9f10c7" containerName="glance-db-sync" Mar 07 21:41:36.657421 master-0 kubenswrapper[16352]: I0307 21:41:36.657377 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.662623 master-0 kubenswrapper[16352]: I0307 21:41:36.662541 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f5db5bd5-2tvbr"] Mar 07 21:41:36.772396 master-0 kubenswrapper[16352]: I0307 21:41:36.772301 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.772396 master-0 kubenswrapper[16352]: I0307 21:41:36.772400 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-svc\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.773187 master-0 kubenswrapper[16352]: I0307 21:41:36.772473 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.773187 master-0 kubenswrapper[16352]: I0307 21:41:36.772514 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.773187 master-0 kubenswrapper[16352]: I0307 21:41:36.772568 
16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-config\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.773187 master-0 kubenswrapper[16352]: I0307 21:41:36.772640 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-chwch\" (UniqueName: \"kubernetes.io/projected/c4d36214-8911-4d50-b736-5984c5ec08b9-kube-api-access-chwch\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.876068 master-0 kubenswrapper[16352]: I0307 21:41:36.875669 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.876698 master-0 kubenswrapper[16352]: I0307 21:41:36.876432 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.876698 master-0 kubenswrapper[16352]: I0307 21:41:36.876552 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-config\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.876840 master-0 
kubenswrapper[16352]: I0307 21:41:36.876720 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-nb\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.877410 master-0 kubenswrapper[16352]: I0307 21:41:36.877188 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-chwch\" (UniqueName: \"kubernetes.io/projected/c4d36214-8911-4d50-b736-5984c5ec08b9-kube-api-access-chwch\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.877410 master-0 kubenswrapper[16352]: I0307 21:41:36.877248 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.877512 master-0 kubenswrapper[16352]: I0307 21:41:36.877451 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-svc\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.880127 master-0 kubenswrapper[16352]: I0307 21:41:36.877983 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-sb\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.880127 master-0 
kubenswrapper[16352]: I0307 21:41:36.878206 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-config\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.880127 master-0 kubenswrapper[16352]: I0307 21:41:36.878349 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-svc\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.891333 master-0 kubenswrapper[16352]: I0307 21:41:36.884663 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-swift-storage-0\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.910566 master-0 kubenswrapper[16352]: I0307 21:41:36.901897 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-chwch\" (UniqueName: \"kubernetes.io/projected/c4d36214-8911-4d50-b736-5984c5ec08b9-kube-api-access-chwch\") pod \"dnsmasq-dns-5f5db5bd5-2tvbr\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:36.997509 master-0 kubenswrapper[16352]: I0307 21:41:36.997351 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:37.229855 master-0 kubenswrapper[16352]: I0307 21:41:37.229756 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f910f111-201a-459a-a4e6-3dd57dc69897" path="/var/lib/kubelet/pods/f910f111-201a-459a-a4e6-3dd57dc69897/volumes" Mar 07 21:41:37.583987 master-0 kubenswrapper[16352]: I0307 21:41:37.583903 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f5db5bd5-2tvbr"] Mar 07 21:41:37.831962 master-0 kubenswrapper[16352]: I0307 21:41:37.831809 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" event={"ID":"c4d36214-8911-4d50-b736-5984c5ec08b9","Type":"ContainerStarted","Data":"5df8e8a3264cd814944acefabeac04ebb9cbc7b8dfa8424a37692ecf6661a3ee"} Mar 07 21:41:37.831962 master-0 kubenswrapper[16352]: I0307 21:41:37.831897 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" event={"ID":"c4d36214-8911-4d50-b736-5984c5ec08b9","Type":"ContainerStarted","Data":"14e763f09afb65fe26dbf38f80617e1b4b9d477ca16b37db1433680204a476d1"} Mar 07 21:41:38.867733 master-0 kubenswrapper[16352]: I0307 21:41:38.867296 16352 generic.go:334] "Generic (PLEG): container finished" podID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerID="5df8e8a3264cd814944acefabeac04ebb9cbc7b8dfa8424a37692ecf6661a3ee" exitCode=0 Mar 07 21:41:38.867733 master-0 kubenswrapper[16352]: I0307 21:41:38.867382 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" event={"ID":"c4d36214-8911-4d50-b736-5984c5ec08b9","Type":"ContainerDied","Data":"5df8e8a3264cd814944acefabeac04ebb9cbc7b8dfa8424a37692ecf6661a3ee"} Mar 07 21:41:38.867733 master-0 kubenswrapper[16352]: I0307 21:41:38.867422 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" 
event={"ID":"c4d36214-8911-4d50-b736-5984c5ec08b9","Type":"ContainerStarted","Data":"b7aa89c5fe579cebb0a26735d81d0685eba148e0dd013740ca1cd478a70320a9"} Mar 07 21:41:38.876737 master-0 kubenswrapper[16352]: I0307 21:41:38.868930 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:38.916750 master-0 kubenswrapper[16352]: I0307 21:41:38.916394 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" podStartSLOduration=2.916361919 podStartE2EDuration="2.916361919s" podCreationTimestamp="2026-03-07 21:41:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:38.909641408 +0000 UTC m=+1421.980346487" watchObservedRunningTime="2026-03-07 21:41:38.916361919 +0000 UTC m=+1421.987066978" Mar 07 21:41:39.886462 master-0 kubenswrapper[16352]: I0307 21:41:39.886363 16352 generic.go:334] "Generic (PLEG): container finished" podID="68d35360-cb8c-439f-8efe-f84fa02416e8" containerID="ce0343eb6e31502152abf1d3207c481b00edb500c99ae5b13af3b1b83142fd0f" exitCode=0 Mar 07 21:41:39.888313 master-0 kubenswrapper[16352]: I0307 21:41:39.888131 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-cmr5t" event={"ID":"68d35360-cb8c-439f-8efe-f84fa02416e8","Type":"ContainerDied","Data":"ce0343eb6e31502152abf1d3207c481b00edb500c99ae5b13af3b1b83142fd0f"} Mar 07 21:41:41.398227 master-0 kubenswrapper[16352]: I0307 21:41:41.398124 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-db-sync-cmr5t" Mar 07 21:41:41.543265 master-0 kubenswrapper[16352]: I0307 21:41:41.543198 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7pq5c\" (UniqueName: \"kubernetes.io/projected/68d35360-cb8c-439f-8efe-f84fa02416e8-kube-api-access-7pq5c\") pod \"68d35360-cb8c-439f-8efe-f84fa02416e8\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " Mar 07 21:41:41.543539 master-0 kubenswrapper[16352]: I0307 21:41:41.543467 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-combined-ca-bundle\") pod \"68d35360-cb8c-439f-8efe-f84fa02416e8\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " Mar 07 21:41:41.543617 master-0 kubenswrapper[16352]: I0307 21:41:41.543592 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-config-data\") pod \"68d35360-cb8c-439f-8efe-f84fa02416e8\" (UID: \"68d35360-cb8c-439f-8efe-f84fa02416e8\") " Mar 07 21:41:41.548200 master-0 kubenswrapper[16352]: I0307 21:41:41.547717 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68d35360-cb8c-439f-8efe-f84fa02416e8-kube-api-access-7pq5c" (OuterVolumeSpecName: "kube-api-access-7pq5c") pod "68d35360-cb8c-439f-8efe-f84fa02416e8" (UID: "68d35360-cb8c-439f-8efe-f84fa02416e8"). InnerVolumeSpecName "kube-api-access-7pq5c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:41.589024 master-0 kubenswrapper[16352]: I0307 21:41:41.588919 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "68d35360-cb8c-439f-8efe-f84fa02416e8" (UID: "68d35360-cb8c-439f-8efe-f84fa02416e8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:41.601623 master-0 kubenswrapper[16352]: I0307 21:41:41.601541 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-config-data" (OuterVolumeSpecName: "config-data") pod "68d35360-cb8c-439f-8efe-f84fa02416e8" (UID: "68d35360-cb8c-439f-8efe-f84fa02416e8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:41.647717 master-0 kubenswrapper[16352]: I0307 21:41:41.647555 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:41.647717 master-0 kubenswrapper[16352]: I0307 21:41:41.647665 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68d35360-cb8c-439f-8efe-f84fa02416e8-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:41.648162 master-0 kubenswrapper[16352]: I0307 21:41:41.647829 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7pq5c\" (UniqueName: \"kubernetes.io/projected/68d35360-cb8c-439f-8efe-f84fa02416e8-kube-api-access-7pq5c\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:41.916929 master-0 kubenswrapper[16352]: I0307 21:41:41.916854 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-db-sync-cmr5t" event={"ID":"68d35360-cb8c-439f-8efe-f84fa02416e8","Type":"ContainerDied","Data":"3922a0eec13a8e70abbf1749ce4253c1444ebb8788b7f51185290dfe684dec6d"} Mar 07 21:41:41.916929 master-0 kubenswrapper[16352]: I0307 21:41:41.916917 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3922a0eec13a8e70abbf1749ce4253c1444ebb8788b7f51185290dfe684dec6d" Mar 07 21:41:41.917266 master-0 kubenswrapper[16352]: I0307 21:41:41.917000 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-cmr5t" Mar 07 21:41:42.284405 master-0 kubenswrapper[16352]: I0307 21:41:42.281774 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-m6pmp"] Mar 07 21:41:42.284405 master-0 kubenswrapper[16352]: E0307 21:41:42.282518 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68d35360-cb8c-439f-8efe-f84fa02416e8" containerName="keystone-db-sync" Mar 07 21:41:42.284405 master-0 kubenswrapper[16352]: I0307 21:41:42.282539 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="68d35360-cb8c-439f-8efe-f84fa02416e8" containerName="keystone-db-sync" Mar 07 21:41:42.284405 master-0 kubenswrapper[16352]: I0307 21:41:42.282822 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="68d35360-cb8c-439f-8efe-f84fa02416e8" containerName="keystone-db-sync" Mar 07 21:41:42.284405 master-0 kubenswrapper[16352]: I0307 21:41:42.283840 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.295575 master-0 kubenswrapper[16352]: I0307 21:41:42.294172 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 07 21:41:42.295575 master-0 kubenswrapper[16352]: I0307 21:41:42.294244 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 07 21:41:42.295575 master-0 kubenswrapper[16352]: I0307 21:41:42.294185 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 07 21:41:42.295575 master-0 kubenswrapper[16352]: I0307 21:41:42.294448 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 07 21:41:42.316990 master-0 kubenswrapper[16352]: I0307 21:41:42.316905 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f5db5bd5-2tvbr"] Mar 07 21:41:42.317430 master-0 kubenswrapper[16352]: I0307 21:41:42.317334 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" podUID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerName="dnsmasq-dns" containerID="cri-o://b7aa89c5fe579cebb0a26735d81d0685eba148e0dd013740ca1cd478a70320a9" gracePeriod=10 Mar 07 21:41:42.338089 master-0 kubenswrapper[16352]: I0307 21:41:42.338032 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m6pmp"] Mar 07 21:41:42.428321 master-0 kubenswrapper[16352]: I0307 21:41:42.427745 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-config-data\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.428321 master-0 kubenswrapper[16352]: I0307 21:41:42.428098 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-scripts\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.428321 master-0 kubenswrapper[16352]: I0307 21:41:42.428147 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qz65x\" (UniqueName: \"kubernetes.io/projected/2c8b935a-cbf8-4c36-918b-eb0d89edab86-kube-api-access-qz65x\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.428321 master-0 kubenswrapper[16352]: I0307 21:41:42.428218 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-combined-ca-bundle\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.428321 master-0 kubenswrapper[16352]: I0307 21:41:42.428302 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-fernet-keys\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.428913 master-0 kubenswrapper[16352]: I0307 21:41:42.428382 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-credential-keys\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.428913 master-0 
kubenswrapper[16352]: I0307 21:41:42.428555 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5787b6ddf7-gjnck"] Mar 07 21:41:42.461153 master-0 kubenswrapper[16352]: I0307 21:41:42.461087 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532176 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-config-data\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532287 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-config\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532377 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-swift-storage-0\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532442 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-68qpz\" (UniqueName: \"kubernetes.io/projected/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-kube-api-access-68qpz\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " 
pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532506 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-scripts\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532559 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qz65x\" (UniqueName: \"kubernetes.io/projected/2c8b935a-cbf8-4c36-918b-eb0d89edab86-kube-api-access-qz65x\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532590 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-sb\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532635 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-combined-ca-bundle\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532698 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-svc\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: 
\"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532718 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-fernet-keys\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532755 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-credential-keys\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.533712 master-0 kubenswrapper[16352]: I0307 21:41:42.532793 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-nb\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.541315 master-0 kubenswrapper[16352]: I0307 21:41:42.541236 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-scripts\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.541698 master-0 kubenswrapper[16352]: I0307 21:41:42.541366 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-config-data\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " 
pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.541811 master-0 kubenswrapper[16352]: I0307 21:41:42.541379 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-combined-ca-bundle\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.547119 master-0 kubenswrapper[16352]: I0307 21:41:42.546459 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-fernet-keys\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.557720 master-0 kubenswrapper[16352]: I0307 21:41:42.557618 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-credential-keys\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.618590 master-0 kubenswrapper[16352]: I0307 21:41:42.617732 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qz65x\" (UniqueName: \"kubernetes.io/projected/2c8b935a-cbf8-4c36-918b-eb0d89edab86-kube-api-access-qz65x\") pod \"keystone-bootstrap-m6pmp\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.626730 master-0 kubenswrapper[16352]: I0307 21:41:42.625762 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5787b6ddf7-gjnck"] Mar 07 21:41:42.640873 master-0 kubenswrapper[16352]: I0307 21:41:42.639207 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-sb\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.640873 master-0 kubenswrapper[16352]: I0307 21:41:42.639721 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-svc\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.640873 master-0 kubenswrapper[16352]: I0307 21:41:42.640612 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-nb\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.641222 master-0 kubenswrapper[16352]: I0307 21:41:42.640958 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-config\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.641222 master-0 kubenswrapper[16352]: I0307 21:41:42.641118 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-swift-storage-0\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.641320 master-0 kubenswrapper[16352]: I0307 21:41:42.641224 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-68qpz\" 
(UniqueName: \"kubernetes.io/projected/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-kube-api-access-68qpz\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.642499 master-0 kubenswrapper[16352]: I0307 21:41:42.642455 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-config\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.674519 master-0 kubenswrapper[16352]: I0307 21:41:42.668744 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-swift-storage-0\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.674519 master-0 kubenswrapper[16352]: I0307 21:41:42.671600 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-svc\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.677023 master-0 kubenswrapper[16352]: I0307 21:41:42.676971 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-sb\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.683628 master-0 kubenswrapper[16352]: I0307 21:41:42.683464 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-nb\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.697170 master-0 kubenswrapper[16352]: I0307 21:41:42.696128 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-68qpz\" (UniqueName: \"kubernetes.io/projected/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-kube-api-access-68qpz\") pod \"dnsmasq-dns-5787b6ddf7-gjnck\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.737868 master-0 kubenswrapper[16352]: I0307 21:41:42.737776 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-create-j9hg2"] Mar 07 21:41:42.744443 master-0 kubenswrapper[16352]: I0307 21:41:42.739622 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:42.753502 master-0 kubenswrapper[16352]: I0307 21:41:42.753435 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:42.801485 master-0 kubenswrapper[16352]: I0307 21:41:42.801269 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-db-sync-m7xht"] Mar 07 21:41:42.803728 master-0 kubenswrapper[16352]: I0307 21:41:42.803663 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.811942 master-0 kubenswrapper[16352]: I0307 21:41:42.811649 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-scripts" Mar 07 21:41:42.812187 master-0 kubenswrapper[16352]: I0307 21:41:42.811761 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-config-data" Mar 07 21:41:42.848396 master-0 kubenswrapper[16352]: I0307 21:41:42.848273 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c17f59fb-df31-45d5-9077-ac10aa310af2-etc-machine-id\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.848920 master-0 kubenswrapper[16352]: I0307 21:41:42.848455 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfbdc\" (UniqueName: \"kubernetes.io/projected/673480b0-be7b-453c-b7b3-8646042b3e59-kube-api-access-sfbdc\") pod \"ironic-db-create-j9hg2\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:42.848920 master-0 kubenswrapper[16352]: I0307 21:41:42.848494 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-combined-ca-bundle\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.848920 master-0 kubenswrapper[16352]: I0307 21:41:42.848575 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v25l\" (UniqueName: \"kubernetes.io/projected/c17f59fb-df31-45d5-9077-ac10aa310af2-kube-api-access-9v25l\") 
pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.850346 master-0 kubenswrapper[16352]: I0307 21:41:42.850312 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-scripts\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.850401 master-0 kubenswrapper[16352]: I0307 21:41:42.850375 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/673480b0-be7b-453c-b7b3-8646042b3e59-operator-scripts\") pod \"ironic-db-create-j9hg2\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:42.850580 master-0 kubenswrapper[16352]: I0307 21:41:42.850549 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-config-data\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.850789 master-0 kubenswrapper[16352]: I0307 21:41:42.850740 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-db-sync-config-data\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.853441 master-0 kubenswrapper[16352]: I0307 21:41:42.852869 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-j9hg2"] Mar 07 21:41:42.858981 master-0 
kubenswrapper[16352]: I0307 21:41:42.858927 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:42.891334 master-0 kubenswrapper[16352]: I0307 21:41:42.890591 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-db-sync-m7xht"] Mar 07 21:41:42.964033 master-0 kubenswrapper[16352]: I0307 21:41:42.963956 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v25l\" (UniqueName: \"kubernetes.io/projected/c17f59fb-df31-45d5-9077-ac10aa310af2-kube-api-access-9v25l\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.964133 master-0 kubenswrapper[16352]: I0307 21:41:42.964055 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-scripts\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.964133 master-0 kubenswrapper[16352]: I0307 21:41:42.964108 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/673480b0-be7b-453c-b7b3-8646042b3e59-operator-scripts\") pod \"ironic-db-create-j9hg2\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:42.964217 master-0 kubenswrapper[16352]: I0307 21:41:42.964197 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-config-data\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.964289 master-0 kubenswrapper[16352]: I0307 21:41:42.964267 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-db-sync-config-data\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.964363 master-0 kubenswrapper[16352]: I0307 21:41:42.964332 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c17f59fb-df31-45d5-9077-ac10aa310af2-etc-machine-id\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.964480 master-0 kubenswrapper[16352]: I0307 21:41:42.964455 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfbdc\" (UniqueName: \"kubernetes.io/projected/673480b0-be7b-453c-b7b3-8646042b3e59-kube-api-access-sfbdc\") pod \"ironic-db-create-j9hg2\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:42.964530 master-0 kubenswrapper[16352]: I0307 21:41:42.964511 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-combined-ca-bundle\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.966880 master-0 kubenswrapper[16352]: I0307 21:41:42.966819 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c17f59fb-df31-45d5-9077-ac10aa310af2-etc-machine-id\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.969465 master-0 kubenswrapper[16352]: I0307 21:41:42.969433 
16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-combined-ca-bundle\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.971259 master-0 kubenswrapper[16352]: I0307 21:41:42.971197 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/673480b0-be7b-453c-b7b3-8646042b3e59-operator-scripts\") pod \"ironic-db-create-j9hg2\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:42.976269 master-0 kubenswrapper[16352]: I0307 21:41:42.976165 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-db-sync-config-data\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.977011 master-0 kubenswrapper[16352]: I0307 21:41:42.976520 16352 generic.go:334] "Generic (PLEG): container finished" podID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerID="b7aa89c5fe579cebb0a26735d81d0685eba148e0dd013740ca1cd478a70320a9" exitCode=0 Mar 07 21:41:42.977011 master-0 kubenswrapper[16352]: I0307 21:41:42.976570 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" event={"ID":"c4d36214-8911-4d50-b736-5984c5ec08b9","Type":"ContainerDied","Data":"b7aa89c5fe579cebb0a26735d81d0685eba148e0dd013740ca1cd478a70320a9"} Mar 07 21:41:42.977626 master-0 kubenswrapper[16352]: I0307 21:41:42.977489 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-config-data\") pod \"cinder-86971-db-sync-m7xht\" (UID: 
\"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:42.978533 master-0 kubenswrapper[16352]: I0307 21:41:42.978487 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-scripts\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:43.034038 master-0 kubenswrapper[16352]: I0307 21:41:43.029093 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v25l\" (UniqueName: \"kubernetes.io/projected/c17f59fb-df31-45d5-9077-ac10aa310af2-kube-api-access-9v25l\") pod \"cinder-86971-db-sync-m7xht\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:43.037231 master-0 kubenswrapper[16352]: I0307 21:41:43.036958 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-97jz8"] Mar 07 21:41:43.042301 master-0 kubenswrapper[16352]: I0307 21:41:43.039257 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.052903 master-0 kubenswrapper[16352]: I0307 21:41:43.045528 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 07 21:41:43.052903 master-0 kubenswrapper[16352]: I0307 21:41:43.045831 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 07 21:41:43.058755 master-0 kubenswrapper[16352]: I0307 21:41:43.058643 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfbdc\" (UniqueName: \"kubernetes.io/projected/673480b0-be7b-453c-b7b3-8646042b3e59-kube-api-access-sfbdc\") pod \"ironic-db-create-j9hg2\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:43.060636 master-0 kubenswrapper[16352]: I0307 21:41:43.060357 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-20ba-account-create-update-4dtlr"] Mar 07 21:41:43.066842 master-0 kubenswrapper[16352]: I0307 21:41:43.066713 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-config\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.067476 master-0 kubenswrapper[16352]: I0307 21:41:43.067103 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kwhp6\" (UniqueName: \"kubernetes.io/projected/218021bb-e4db-42b1-a553-f2a373cd9565-kube-api-access-kwhp6\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.067476 master-0 kubenswrapper[16352]: I0307 21:41:43.067189 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-combined-ca-bundle\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.071164 master-0 kubenswrapper[16352]: I0307 21:41:43.070575 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.077157 master-0 kubenswrapper[16352]: I0307 21:41:43.077122 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-db-secret" Mar 07 21:41:43.085126 master-0 kubenswrapper[16352]: I0307 21:41:43.085010 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:43.090884 master-0 kubenswrapper[16352]: I0307 21:41:43.090659 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-97jz8"] Mar 07 21:41:43.107757 master-0 kubenswrapper[16352]: I0307 21:41:43.107637 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-20ba-account-create-update-4dtlr"] Mar 07 21:41:43.136079 master-0 kubenswrapper[16352]: I0307 21:41:43.133857 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:43.140128 master-0 kubenswrapper[16352]: I0307 21:41:43.139641 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-4wkkv"] Mar 07 21:41:43.141089 master-0 kubenswrapper[16352]: E0307 21:41:43.141054 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerName="init" Mar 07 21:41:43.141089 master-0 kubenswrapper[16352]: I0307 21:41:43.141088 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerName="init" Mar 07 21:41:43.141184 master-0 kubenswrapper[16352]: E0307 21:41:43.141136 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerName="dnsmasq-dns" Mar 07 21:41:43.141184 master-0 kubenswrapper[16352]: I0307 21:41:43.141147 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerName="dnsmasq-dns" Mar 07 21:41:43.141477 master-0 kubenswrapper[16352]: I0307 21:41:43.141448 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4d36214-8911-4d50-b736-5984c5ec08b9" containerName="dnsmasq-dns" Mar 07 21:41:43.144219 master-0 kubenswrapper[16352]: I0307 21:41:43.144186 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.148399 master-0 kubenswrapper[16352]: I0307 21:41:43.148327 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 07 21:41:43.148782 master-0 kubenswrapper[16352]: I0307 21:41:43.148736 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 07 21:41:43.157134 master-0 kubenswrapper[16352]: I0307 21:41:43.157049 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5787b6ddf7-gjnck"] Mar 07 21:41:43.175117 master-0 kubenswrapper[16352]: I0307 21:41:43.174203 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-combined-ca-bundle\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.175117 master-0 kubenswrapper[16352]: I0307 21:41:43.174791 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-config\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.175117 master-0 kubenswrapper[16352]: I0307 21:41:43.174999 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kwhp6\" (UniqueName: \"kubernetes.io/projected/218021bb-e4db-42b1-a553-f2a373cd9565-kube-api-access-kwhp6\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.183458 master-0 kubenswrapper[16352]: I0307 21:41:43.181939 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-combined-ca-bundle\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.184181 master-0 kubenswrapper[16352]: I0307 21:41:43.184125 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-config\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.190969 master-0 kubenswrapper[16352]: I0307 21:41:43.190531 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4wkkv"] Mar 07 21:41:43.221436 master-0 kubenswrapper[16352]: I0307 21:41:43.216771 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kwhp6\" (UniqueName: \"kubernetes.io/projected/218021bb-e4db-42b1-a553-f2a373cd9565-kube-api-access-kwhp6\") pod \"neutron-db-sync-97jz8\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.256987 master-0 kubenswrapper[16352]: I0307 21:41:43.256912 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:41:43.268233 master-0 kubenswrapper[16352]: I0307 21:41:43.268180 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7bbc6577f5-mldsh"] Mar 07 21:41:43.276496 master-0 kubenswrapper[16352]: I0307 21:41:43.276425 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-sb\") pod \"c4d36214-8911-4d50-b736-5984c5ec08b9\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " Mar 07 21:41:43.276788 master-0 kubenswrapper[16352]: I0307 21:41:43.276760 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-config\") pod \"c4d36214-8911-4d50-b736-5984c5ec08b9\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " Mar 07 21:41:43.277351 master-0 kubenswrapper[16352]: I0307 21:41:43.277324 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-swift-storage-0\") pod \"c4d36214-8911-4d50-b736-5984c5ec08b9\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " Mar 07 21:41:43.278915 master-0 kubenswrapper[16352]: I0307 21:41:43.278692 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-chwch\" (UniqueName: \"kubernetes.io/projected/c4d36214-8911-4d50-b736-5984c5ec08b9-kube-api-access-chwch\") pod \"c4d36214-8911-4d50-b736-5984c5ec08b9\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " Mar 07 21:41:43.278967 master-0 kubenswrapper[16352]: I0307 21:41:43.278939 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-nb\") pod 
\"c4d36214-8911-4d50-b736-5984c5ec08b9\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " Mar 07 21:41:43.279006 master-0 kubenswrapper[16352]: I0307 21:41:43.278989 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-svc\") pod \"c4d36214-8911-4d50-b736-5984c5ec08b9\" (UID: \"c4d36214-8911-4d50-b736-5984c5ec08b9\") " Mar 07 21:41:43.281027 master-0 kubenswrapper[16352]: I0307 21:41:43.280825 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-scripts\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.281027 master-0 kubenswrapper[16352]: I0307 21:41:43.280993 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63cde-c147-4e68-8491-753368687501-operator-scripts\") pod \"ironic-20ba-account-create-update-4dtlr\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.281281 master-0 kubenswrapper[16352]: I0307 21:41:43.281040 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-combined-ca-bundle\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.281281 master-0 kubenswrapper[16352]: I0307 21:41:43.281100 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-logs\") pod 
\"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.284452 master-0 kubenswrapper[16352]: I0307 21:41:43.281905 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwtn\" (UniqueName: \"kubernetes.io/projected/d3d63cde-c147-4e68-8491-753368687501-kube-api-access-svwtn\") pod \"ironic-20ba-account-create-update-4dtlr\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.284452 master-0 kubenswrapper[16352]: I0307 21:41:43.282010 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbxwk\" (UniqueName: \"kubernetes.io/projected/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-kube-api-access-rbxwk\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.284452 master-0 kubenswrapper[16352]: I0307 21:41:43.282610 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-config-data\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.313709 master-0 kubenswrapper[16352]: I0307 21:41:43.299174 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4d36214-8911-4d50-b736-5984c5ec08b9-kube-api-access-chwch" (OuterVolumeSpecName: "kube-api-access-chwch") pod "c4d36214-8911-4d50-b736-5984c5ec08b9" (UID: "c4d36214-8911-4d50-b736-5984c5ec08b9"). InnerVolumeSpecName "kube-api-access-chwch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:43.313709 master-0 kubenswrapper[16352]: I0307 21:41:43.302234 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.345414 master-0 kubenswrapper[16352]: I0307 21:41:43.337465 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bbc6577f5-mldsh"] Mar 07 21:41:43.381128 master-0 kubenswrapper[16352]: I0307 21:41:43.379763 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-97jz8" Mar 07 21:41:43.388136 master-0 kubenswrapper[16352]: I0307 21:41:43.384748 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svwtn\" (UniqueName: \"kubernetes.io/projected/d3d63cde-c147-4e68-8491-753368687501-kube-api-access-svwtn\") pod \"ironic-20ba-account-create-update-4dtlr\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.398314 master-0 kubenswrapper[16352]: I0307 21:41:43.398209 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbxwk\" (UniqueName: \"kubernetes.io/projected/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-kube-api-access-rbxwk\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.398556 master-0 kubenswrapper[16352]: I0307 21:41:43.398527 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-config-data\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.398599 master-0 kubenswrapper[16352]: I0307 21:41:43.398583 16352 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-scripts\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.398707 master-0 kubenswrapper[16352]: I0307 21:41:43.398672 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63cde-c147-4e68-8491-753368687501-operator-scripts\") pod \"ironic-20ba-account-create-update-4dtlr\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.398759 master-0 kubenswrapper[16352]: I0307 21:41:43.398721 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-combined-ca-bundle\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.398794 master-0 kubenswrapper[16352]: I0307 21:41:43.398763 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-logs\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.399247 master-0 kubenswrapper[16352]: I0307 21:41:43.399217 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-chwch\" (UniqueName: \"kubernetes.io/projected/c4d36214-8911-4d50-b736-5984c5ec08b9-kube-api-access-chwch\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:43.399741 master-0 kubenswrapper[16352]: I0307 21:41:43.399658 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-logs\") pod 
\"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.400336 master-0 kubenswrapper[16352]: I0307 21:41:43.400305 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63cde-c147-4e68-8491-753368687501-operator-scripts\") pod \"ironic-20ba-account-create-update-4dtlr\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.422798 master-0 kubenswrapper[16352]: I0307 21:41:43.422749 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-combined-ca-bundle\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.429129 master-0 kubenswrapper[16352]: I0307 21:41:43.429073 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-scripts\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.444737 master-0 kubenswrapper[16352]: I0307 21:41:43.444538 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-config-data\") pod \"placement-db-sync-4wkkv\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.447644 master-0 kubenswrapper[16352]: I0307 21:41:43.446236 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbxwk\" (UniqueName: \"kubernetes.io/projected/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-kube-api-access-rbxwk\") pod \"placement-db-sync-4wkkv\" (UID: 
\"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") " pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.449709 master-0 kubenswrapper[16352]: I0307 21:41:43.449647 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svwtn\" (UniqueName: \"kubernetes.io/projected/d3d63cde-c147-4e68-8491-753368687501-kube-api-access-svwtn\") pod \"ironic-20ba-account-create-update-4dtlr\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.480216 master-0 kubenswrapper[16352]: I0307 21:41:43.480125 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4wkkv" Mar 07 21:41:43.481519 master-0 kubenswrapper[16352]: I0307 21:41:43.481452 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c4d36214-8911-4d50-b736-5984c5ec08b9" (UID: "c4d36214-8911-4d50-b736-5984c5ec08b9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:43.503024 master-0 kubenswrapper[16352]: I0307 21:41:43.502925 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-swift-storage-0\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.503806 master-0 kubenswrapper[16352]: I0307 21:41:43.503044 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-sb\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.503806 master-0 kubenswrapper[16352]: I0307 21:41:43.503146 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-config\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.503806 master-0 kubenswrapper[16352]: I0307 21:41:43.503182 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-svc\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.503806 master-0 kubenswrapper[16352]: I0307 21:41:43.503275 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgm2d\" (UniqueName: 
\"kubernetes.io/projected/6a84aa45-9fea-4aaa-8e68-500d08c4f625-kube-api-access-fgm2d\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.503806 master-0 kubenswrapper[16352]: I0307 21:41:43.503327 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-nb\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.503806 master-0 kubenswrapper[16352]: I0307 21:41:43.503486 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:43.504614 master-0 kubenswrapper[16352]: I0307 21:41:43.504539 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c4d36214-8911-4d50-b736-5984c5ec08b9" (UID: "c4d36214-8911-4d50-b736-5984c5ec08b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:43.524004 master-0 kubenswrapper[16352]: I0307 21:41:43.522939 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5787b6ddf7-gjnck"] Mar 07 21:41:43.538519 master-0 kubenswrapper[16352]: I0307 21:41:43.538433 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c4d36214-8911-4d50-b736-5984c5ec08b9" (UID: "c4d36214-8911-4d50-b736-5984c5ec08b9"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:43.572350 master-0 kubenswrapper[16352]: I0307 21:41:43.572257 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-config" (OuterVolumeSpecName: "config") pod "c4d36214-8911-4d50-b736-5984c5ec08b9" (UID: "c4d36214-8911-4d50-b736-5984c5ec08b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:43.587736 master-0 kubenswrapper[16352]: I0307 21:41:43.587650 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c4d36214-8911-4d50-b736-5984c5ec08b9" (UID: "c4d36214-8911-4d50-b736-5984c5ec08b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.612127 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgm2d\" (UniqueName: \"kubernetes.io/projected/6a84aa45-9fea-4aaa-8e68-500d08c4f625-kube-api-access-fgm2d\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.612465 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-nb\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.613267 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-swift-storage-0\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.613328 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-sb\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.616145 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-swift-storage-0\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.617118 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-nb\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.617356 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-sb\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.618079 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-config\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.618168 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-svc\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.618888 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.618923 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.619062 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.619082 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c4d36214-8911-4d50-b736-5984c5ec08b9-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:43.620198 master-0 kubenswrapper[16352]: I0307 21:41:43.619930 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-config\") pod 
\"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.621488 master-0 kubenswrapper[16352]: I0307 21:41:43.621008 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-svc\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.715647 master-0 kubenswrapper[16352]: I0307 21:41:43.715547 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:43.788944 master-0 kubenswrapper[16352]: I0307 21:41:43.779698 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-m6pmp"] Mar 07 21:41:43.793071 master-0 kubenswrapper[16352]: I0307 21:41:43.792287 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgm2d\" (UniqueName: \"kubernetes.io/projected/6a84aa45-9fea-4aaa-8e68-500d08c4f625-kube-api-access-fgm2d\") pod \"dnsmasq-dns-7bbc6577f5-mldsh\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:43.800384 master-0 kubenswrapper[16352]: I0307 21:41:43.800344 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:44.020105 master-0 kubenswrapper[16352]: I0307 21:41:44.017584 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" event={"ID":"c4d36214-8911-4d50-b736-5984c5ec08b9","Type":"ContainerDied","Data":"14e763f09afb65fe26dbf38f80617e1b4b9d477ca16b37db1433680204a476d1"} Mar 07 21:41:44.020105 master-0 kubenswrapper[16352]: I0307 21:41:44.017669 16352 scope.go:117] "RemoveContainer" containerID="b7aa89c5fe579cebb0a26735d81d0685eba148e0dd013740ca1cd478a70320a9" Mar 07 21:41:44.020105 master-0 kubenswrapper[16352]: I0307 21:41:44.017884 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f5db5bd5-2tvbr" Mar 07 21:41:44.080903 master-0 kubenswrapper[16352]: I0307 21:41:44.071330 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6pmp" event={"ID":"2c8b935a-cbf8-4c36-918b-eb0d89edab86","Type":"ContainerStarted","Data":"443d3fbb82da10045674595b92d6df413bb31586370ba36c2534f8f6f6d60720"} Mar 07 21:41:44.126290 master-0 kubenswrapper[16352]: I0307 21:41:44.126195 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" event={"ID":"7785e25a-92dc-4f55-bbdf-ed970b2ff79d","Type":"ContainerStarted","Data":"4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6"} Mar 07 21:41:44.126290 master-0 kubenswrapper[16352]: I0307 21:41:44.126278 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" event={"ID":"7785e25a-92dc-4f55-bbdf-ed970b2ff79d","Type":"ContainerStarted","Data":"a98548577813699f32f68b319e8edb0759ddcee060e0a3ec288be6fbd9aa4e17"} Mar 07 21:41:44.147375 master-0 kubenswrapper[16352]: I0307 21:41:44.145417 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-create-j9hg2"] Mar 07 21:41:44.173128 master-0 kubenswrapper[16352]: 
I0307 21:41:44.172363 16352 scope.go:117] "RemoveContainer" containerID="5df8e8a3264cd814944acefabeac04ebb9cbc7b8dfa8424a37692ecf6661a3ee" Mar 07 21:41:44.303029 master-0 kubenswrapper[16352]: I0307 21:41:44.302942 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f5db5bd5-2tvbr"] Mar 07 21:41:44.312533 master-0 kubenswrapper[16352]: I0307 21:41:44.312442 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f5db5bd5-2tvbr"] Mar 07 21:41:44.363732 master-0 kubenswrapper[16352]: I0307 21:41:44.363515 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-db-sync-m7xht"] Mar 07 21:41:44.375083 master-0 kubenswrapper[16352]: I0307 21:41:44.374978 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-97jz8"] Mar 07 21:41:44.390004 master-0 kubenswrapper[16352]: W0307 21:41:44.389916 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod218021bb_e4db_42b1_a553_f2a373cd9565.slice/crio-93b7f4e3ae4f2429498f15d2af853fd9d8540ff43a42b6c42a24df6e86b82b16 WatchSource:0}: Error finding container 93b7f4e3ae4f2429498f15d2af853fd9d8540ff43a42b6c42a24df6e86b82b16: Status 404 returned error can't find the container with id 93b7f4e3ae4f2429498f15d2af853fd9d8540ff43a42b6c42a24df6e86b82b16 Mar 07 21:41:44.392514 master-0 kubenswrapper[16352]: W0307 21:41:44.392370 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc17f59fb_df31_45d5_9077_ac10aa310af2.slice/crio-9768b16e24c5ec52ce4a5a132c04c14cef91bc209e2b69a0afa633a6de221b85 WatchSource:0}: Error finding container 9768b16e24c5ec52ce4a5a132c04c14cef91bc209e2b69a0afa633a6de221b85: Status 404 returned error can't find the container with id 9768b16e24c5ec52ce4a5a132c04c14cef91bc209e2b69a0afa633a6de221b85 Mar 07 21:41:44.433413 master-0 
kubenswrapper[16352]: I0307 21:41:44.433261 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:41:44.436945 master-0 kubenswrapper[16352]: I0307 21:41:44.436886 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.441960 master-0 kubenswrapper[16352]: I0307 21:41:44.441916 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-213eb-default-external-config-data" Mar 07 21:41:44.442189 master-0 kubenswrapper[16352]: I0307 21:41:44.442119 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 07 21:41:44.442275 master-0 kubenswrapper[16352]: I0307 21:41:44.442253 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 07 21:41:44.453050 master-0 kubenswrapper[16352]: I0307 21:41:44.452911 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:41:44.475349 master-0 kubenswrapper[16352]: I0307 21:41:44.475170 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5x6fd\" (UniqueName: \"kubernetes.io/projected/6745f880-a5c0-4d62-9182-ee075ed0d212-kube-api-access-5x6fd\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.475580 master-0 kubenswrapper[16352]: I0307 21:41:44.475353 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 
21:41:44.475580 master-0 kubenswrapper[16352]: I0307 21:41:44.475408 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.475580 master-0 kubenswrapper[16352]: I0307 21:41:44.475460 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.475800 master-0 kubenswrapper[16352]: I0307 21:41:44.475743 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.476813 master-0 kubenswrapper[16352]: I0307 21:41:44.475816 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.476813 master-0 kubenswrapper[16352]: I0307 21:41:44.475857 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.476813 master-0 kubenswrapper[16352]: I0307 21:41:44.475890 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.579218 master-0 kubenswrapper[16352]: I0307 21:41:44.578392 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.579218 master-0 kubenswrapper[16352]: I0307 21:41:44.578485 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.579218 master-0 kubenswrapper[16352]: I0307 21:41:44.578530 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.579612 master-0 kubenswrapper[16352]: I0307 21:41:44.579558 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.579882 master-0 kubenswrapper[16352]: I0307 21:41:44.579346 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.579882 master-0 kubenswrapper[16352]: I0307 21:41:44.579869 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5x6fd\" (UniqueName: \"kubernetes.io/projected/6745f880-a5c0-4d62-9182-ee075ed0d212-kube-api-access-5x6fd\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.580161 master-0 kubenswrapper[16352]: I0307 21:41:44.580090 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.580257 master-0 kubenswrapper[16352]: I0307 21:41:44.580202 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.580379 master-0 
kubenswrapper[16352]: I0307 21:41:44.580348 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.581370 master-0 kubenswrapper[16352]: I0307 21:41:44.581038 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.588319 master-0 kubenswrapper[16352]: I0307 21:41:44.586733 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 21:41:44.588319 master-0 kubenswrapper[16352]: I0307 21:41:44.586820 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a513f999ca477ba7af8fd57b1445c957c7136c73e46ac94a843087871d1d0d27/globalmount\"" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.588319 master-0 kubenswrapper[16352]: I0307 21:41:44.587081 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.591732 master-0 
kubenswrapper[16352]: I0307 21:41:44.590174 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.591732 master-0 kubenswrapper[16352]: I0307 21:41:44.591603 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.595103 master-0 kubenswrapper[16352]: I0307 21:41:44.595039 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.608049 master-0 kubenswrapper[16352]: I0307 21:41:44.607991 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5x6fd\" (UniqueName: \"kubernetes.io/projected/6745f880-a5c0-4d62-9182-ee075ed0d212-kube-api-access-5x6fd\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:44.964063 master-0 kubenswrapper[16352]: I0307 21:41:44.963967 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7bbc6577f5-mldsh"] Mar 07 21:41:44.980112 master-0 kubenswrapper[16352]: I0307 21:41:44.980023 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-20ba-account-create-update-4dtlr"] Mar 07 21:41:44.981397 
master-0 kubenswrapper[16352]: W0307 21:41:44.981352 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a84aa45_9fea_4aaa_8e68_500d08c4f625.slice/crio-893676acaf7cdf2a2f3b3c8d64661fa8e2c100e5b3b776e2a53ce289aab9b21e WatchSource:0}: Error finding container 893676acaf7cdf2a2f3b3c8d64661fa8e2c100e5b3b776e2a53ce289aab9b21e: Status 404 returned error can't find the container with id 893676acaf7cdf2a2f3b3c8d64661fa8e2c100e5b3b776e2a53ce289aab9b21e Mar 07 21:41:45.000716 master-0 kubenswrapper[16352]: W0307 21:41:44.999054 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d63cde_c147_4e68_8491_753368687501.slice/crio-a0acb5213a4702c7fa0ec09a51184681814cce4c540330288544a4751a7b03f5 WatchSource:0}: Error finding container a0acb5213a4702c7fa0ec09a51184681814cce4c540330288544a4751a7b03f5: Status 404 returned error can't find the container with id a0acb5213a4702c7fa0ec09a51184681814cce4c540330288544a4751a7b03f5 Mar 07 21:41:45.010998 master-0 kubenswrapper[16352]: I0307 21:41:45.010855 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-4wkkv"] Mar 07 21:41:45.076937 master-0 kubenswrapper[16352]: I0307 21:41:45.076870 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:45.155587 master-0 kubenswrapper[16352]: I0307 21:41:45.155322 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6pmp" event={"ID":"2c8b935a-cbf8-4c36-918b-eb0d89edab86","Type":"ContainerStarted","Data":"fc994383f7876be1e38177029dc4289405bb594f54e4d51c6a4f561d4a6fb893"} Mar 07 21:41:45.161329 master-0 kubenswrapper[16352]: I0307 21:41:45.161280 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" event={"ID":"6a84aa45-9fea-4aaa-8e68-500d08c4f625","Type":"ContainerStarted","Data":"893676acaf7cdf2a2f3b3c8d64661fa8e2c100e5b3b776e2a53ce289aab9b21e"} Mar 07 21:41:45.167406 master-0 kubenswrapper[16352]: I0307 21:41:45.166570 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4wkkv" event={"ID":"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985","Type":"ContainerStarted","Data":"4038a04d366ac1e66ab753c6404d18fab589dd8d8f50eb29e1337922fa518982"} Mar 07 21:41:45.187136 master-0 kubenswrapper[16352]: I0307 21:41:45.187032 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-m6pmp" podStartSLOduration=3.187008349 podStartE2EDuration="3.187008349s" podCreationTimestamp="2026-03-07 21:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:45.184989721 +0000 UTC m=+1428.255694780" watchObservedRunningTime="2026-03-07 21:41:45.187008349 +0000 UTC m=+1428.257713408" Mar 07 21:41:45.207785 master-0 kubenswrapper[16352]: I0307 21:41:45.207694 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-swift-storage-0\") pod \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\" (UID: 
\"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " Mar 07 21:41:45.208044 master-0 kubenswrapper[16352]: I0307 21:41:45.207890 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-nb\") pod \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " Mar 07 21:41:45.208044 master-0 kubenswrapper[16352]: I0307 21:41:45.207933 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-config\") pod \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " Mar 07 21:41:45.208044 master-0 kubenswrapper[16352]: I0307 21:41:45.207998 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-68qpz\" (UniqueName: \"kubernetes.io/projected/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-kube-api-access-68qpz\") pod \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " Mar 07 21:41:45.208186 master-0 kubenswrapper[16352]: I0307 21:41:45.208088 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-svc\") pod \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " Mar 07 21:41:45.208245 master-0 kubenswrapper[16352]: I0307 21:41:45.208218 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-sb\") pod \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\" (UID: \"7785e25a-92dc-4f55-bbdf-ed970b2ff79d\") " Mar 07 21:41:45.216823 master-0 kubenswrapper[16352]: I0307 21:41:45.216746 16352 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-kube-api-access-68qpz" (OuterVolumeSpecName: "kube-api-access-68qpz") pod "7785e25a-92dc-4f55-bbdf-ed970b2ff79d" (UID: "7785e25a-92dc-4f55-bbdf-ed970b2ff79d"). InnerVolumeSpecName "kube-api-access-68qpz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:45.217092 master-0 kubenswrapper[16352]: I0307 21:41:45.217023 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4d36214-8911-4d50-b736-5984c5ec08b9" path="/var/lib/kubelet/pods/c4d36214-8911-4d50-b736-5984c5ec08b9/volumes" Mar 07 21:41:45.226398 master-0 kubenswrapper[16352]: I0307 21:41:45.226050 16352 generic.go:334] "Generic (PLEG): container finished" podID="673480b0-be7b-453c-b7b3-8646042b3e59" containerID="c84f49f70e57b95ce5538ee514920d8345a901e6861c316e5479d3149141e5f4" exitCode=0 Mar 07 21:41:45.230668 master-0 kubenswrapper[16352]: I0307 21:41:45.230214 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-20ba-account-create-update-4dtlr" event={"ID":"d3d63cde-c147-4e68-8491-753368687501","Type":"ContainerStarted","Data":"a0acb5213a4702c7fa0ec09a51184681814cce4c540330288544a4751a7b03f5"} Mar 07 21:41:45.230668 master-0 kubenswrapper[16352]: I0307 21:41:45.230267 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-db-sync-m7xht" event={"ID":"c17f59fb-df31-45d5-9077-ac10aa310af2","Type":"ContainerStarted","Data":"9768b16e24c5ec52ce4a5a132c04c14cef91bc209e2b69a0afa633a6de221b85"} Mar 07 21:41:45.230668 master-0 kubenswrapper[16352]: I0307 21:41:45.230282 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-j9hg2" event={"ID":"673480b0-be7b-453c-b7b3-8646042b3e59","Type":"ContainerDied","Data":"c84f49f70e57b95ce5538ee514920d8345a901e6861c316e5479d3149141e5f4"} Mar 07 21:41:45.230668 master-0 kubenswrapper[16352]: I0307 21:41:45.230300 16352 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/ironic-db-create-j9hg2" event={"ID":"673480b0-be7b-453c-b7b3-8646042b3e59","Type":"ContainerStarted","Data":"f26642a0a34d4dd316be2f06eee1ec4ddcdc84dd22ec11d087bb50eb0fb1820e"} Mar 07 21:41:45.243352 master-0 kubenswrapper[16352]: I0307 21:41:45.243003 16352 generic.go:334] "Generic (PLEG): container finished" podID="7785e25a-92dc-4f55-bbdf-ed970b2ff79d" containerID="4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6" exitCode=0 Mar 07 21:41:45.243352 master-0 kubenswrapper[16352]: I0307 21:41:45.243120 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" Mar 07 21:41:45.243352 master-0 kubenswrapper[16352]: I0307 21:41:45.243256 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" event={"ID":"7785e25a-92dc-4f55-bbdf-ed970b2ff79d","Type":"ContainerDied","Data":"4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6"} Mar 07 21:41:45.243528 master-0 kubenswrapper[16352]: I0307 21:41:45.243404 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5787b6ddf7-gjnck" event={"ID":"7785e25a-92dc-4f55-bbdf-ed970b2ff79d","Type":"ContainerDied","Data":"a98548577813699f32f68b319e8edb0759ddcee060e0a3ec288be6fbd9aa4e17"} Mar 07 21:41:45.243528 master-0 kubenswrapper[16352]: I0307 21:41:45.243448 16352 scope.go:117] "RemoveContainer" containerID="4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6" Mar 07 21:41:45.256870 master-0 kubenswrapper[16352]: I0307 21:41:45.256747 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7785e25a-92dc-4f55-bbdf-ed970b2ff79d" (UID: "7785e25a-92dc-4f55-bbdf-ed970b2ff79d"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:45.259870 master-0 kubenswrapper[16352]: I0307 21:41:45.259803 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97jz8" event={"ID":"218021bb-e4db-42b1-a553-f2a373cd9565","Type":"ContainerStarted","Data":"8e7336a0eb6ab818f0c8d258f43028b48d1c7900ff93861475b25e2a1a77ecd8"} Mar 07 21:41:45.260003 master-0 kubenswrapper[16352]: I0307 21:41:45.259947 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97jz8" event={"ID":"218021bb-e4db-42b1-a553-f2a373cd9565","Type":"ContainerStarted","Data":"93b7f4e3ae4f2429498f15d2af853fd9d8540ff43a42b6c42a24df6e86b82b16"} Mar 07 21:41:45.289707 master-0 kubenswrapper[16352]: I0307 21:41:45.289259 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-97jz8" podStartSLOduration=3.289230403 podStartE2EDuration="3.289230403s" podCreationTimestamp="2026-03-07 21:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:45.278411823 +0000 UTC m=+1428.349116892" watchObservedRunningTime="2026-03-07 21:41:45.289230403 +0000 UTC m=+1428.359935462" Mar 07 21:41:45.313234 master-0 kubenswrapper[16352]: I0307 21:41:45.312452 16352 scope.go:117] "RemoveContainer" containerID="4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6" Mar 07 21:41:45.313234 master-0 kubenswrapper[16352]: E0307 21:41:45.313118 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6\": container with ID starting with 4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6 not found: ID does not exist" containerID="4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6" Mar 07 21:41:45.313234 master-0 
kubenswrapper[16352]: I0307 21:41:45.313160 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6"} err="failed to get container status \"4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6\": rpc error: code = NotFound desc = could not find container \"4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6\": container with ID starting with 4fab4f24167f0bc1bbaf9925cf3b825e2232fd30de10d9f50bb0b5b94e750ad6 not found: ID does not exist" Mar 07 21:41:45.314790 master-0 kubenswrapper[16352]: I0307 21:41:45.314750 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-68qpz\" (UniqueName: \"kubernetes.io/projected/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-kube-api-access-68qpz\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:45.314790 master-0 kubenswrapper[16352]: I0307 21:41:45.314786 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:45.355347 master-0 kubenswrapper[16352]: I0307 21:41:45.355282 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7785e25a-92dc-4f55-bbdf-ed970b2ff79d" (UID: "7785e25a-92dc-4f55-bbdf-ed970b2ff79d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:45.355612 master-0 kubenswrapper[16352]: I0307 21:41:45.355354 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-config" (OuterVolumeSpecName: "config") pod "7785e25a-92dc-4f55-bbdf-ed970b2ff79d" (UID: "7785e25a-92dc-4f55-bbdf-ed970b2ff79d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:45.363870 master-0 kubenswrapper[16352]: I0307 21:41:45.363770 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "7785e25a-92dc-4f55-bbdf-ed970b2ff79d" (UID: "7785e25a-92dc-4f55-bbdf-ed970b2ff79d"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:45.387798 master-0 kubenswrapper[16352]: I0307 21:41:45.387705 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7785e25a-92dc-4f55-bbdf-ed970b2ff79d" (UID: "7785e25a-92dc-4f55-bbdf-ed970b2ff79d"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:45.418177 master-0 kubenswrapper[16352]: I0307 21:41:45.418089 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:45.418430 master-0 kubenswrapper[16352]: I0307 21:41:45.418196 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:45.418430 master-0 kubenswrapper[16352]: I0307 21:41:45.418210 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:45.418430 master-0 kubenswrapper[16352]: I0307 21:41:45.418223 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/7785e25a-92dc-4f55-bbdf-ed970b2ff79d-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:45.581668 master-0 kubenswrapper[16352]: I0307 21:41:45.579102 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:45.581668 master-0 kubenswrapper[16352]: E0307 21:41:45.579908 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7785e25a-92dc-4f55-bbdf-ed970b2ff79d" containerName="init" Mar 07 21:41:45.581668 master-0 kubenswrapper[16352]: I0307 21:41:45.579958 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="7785e25a-92dc-4f55-bbdf-ed970b2ff79d" containerName="init" Mar 07 21:41:45.581668 master-0 kubenswrapper[16352]: I0307 21:41:45.580340 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="7785e25a-92dc-4f55-bbdf-ed970b2ff79d" containerName="init" Mar 07 21:41:45.582716 master-0 kubenswrapper[16352]: I0307 21:41:45.582650 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.592244 master-0 kubenswrapper[16352]: I0307 21:41:45.592029 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-213eb-default-internal-config-data" Mar 07 21:41:45.592244 master-0 kubenswrapper[16352]: I0307 21:41:45.592116 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 07 21:41:45.605156 master-0 kubenswrapper[16352]: I0307 21:41:45.605085 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:45.724537 master-0 kubenswrapper[16352]: I0307 21:41:45.724466 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.725112 master-0 kubenswrapper[16352]: I0307 21:41:45.724561 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.725112 master-0 kubenswrapper[16352]: I0307 21:41:45.724596 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.725112 
master-0 kubenswrapper[16352]: I0307 21:41:45.724656 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmdkp\" (UniqueName: \"kubernetes.io/projected/db8b9f34-b623-40eb-94be-0ad5eda49df9-kube-api-access-vmdkp\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.725112 master-0 kubenswrapper[16352]: I0307 21:41:45.724715 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.725112 master-0 kubenswrapper[16352]: I0307 21:41:45.724855 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.725112 master-0 kubenswrapper[16352]: I0307 21:41:45.724930 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.725112 master-0 kubenswrapper[16352]: I0307 21:41:45.725014 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-logs\") pod 
\"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.826968 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.827072 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.827125 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.827284 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.827355 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.827410 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.827494 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmdkp\" (UniqueName: \"kubernetes.io/projected/db8b9f34-b623-40eb-94be-0ad5eda49df9-kube-api-access-vmdkp\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.828436 master-0 kubenswrapper[16352]: I0307 21:41:45.827547 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.830309 master-0 kubenswrapper[16352]: I0307 21:41:45.829076 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.835589 master-0 kubenswrapper[16352]: I0307 21:41:45.834831 16352 csi_attacher.go:380] kubernetes.io/csi: 
attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 21:41:45.835589 master-0 kubenswrapper[16352]: I0307 21:41:45.834869 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/26e6a054227d2ae645fb0f70048c6b35076d5abdd3e58247e88864732765f6e0/globalmount\"" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.837059 master-0 kubenswrapper[16352]: I0307 21:41:45.836994 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.837464 master-0 kubenswrapper[16352]: I0307 21:41:45.837369 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.900842 master-0 kubenswrapper[16352]: I0307 21:41:45.900741 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.907603 master-0 kubenswrapper[16352]: I0307 21:41:45.904864 16352 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.907603 master-0 kubenswrapper[16352]: I0307 21:41:45.905839 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.920780 master-0 kubenswrapper[16352]: I0307 21:41:45.918517 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmdkp\" (UniqueName: \"kubernetes.io/projected/db8b9f34-b623-40eb-94be-0ad5eda49df9-kube-api-access-vmdkp\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:45.944288 master-0 kubenswrapper[16352]: I0307 21:41:45.942867 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5787b6ddf7-gjnck"] Mar 07 21:41:45.961493 master-0 kubenswrapper[16352]: I0307 21:41:45.960081 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:41:45.964927 master-0 kubenswrapper[16352]: E0307 21:41:45.964839 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-213eb-default-external-api-0" podUID="6745f880-a5c0-4d62-9182-ee075ed0d212" Mar 07 21:41:45.975378 master-0 kubenswrapper[16352]: I0307 21:41:45.974431 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/dnsmasq-dns-5787b6ddf7-gjnck"] Mar 07 21:41:46.039728 master-0 kubenswrapper[16352]: I0307 21:41:46.030244 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:46.039728 master-0 kubenswrapper[16352]: E0307 21:41:46.032461 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[glance], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openstack/glance-213eb-default-internal-api-0" podUID="db8b9f34-b623-40eb-94be-0ad5eda49df9" Mar 07 21:41:46.155723 master-0 kubenswrapper[16352]: I0307 21:41:46.154768 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:46.354338 master-0 kubenswrapper[16352]: I0307 21:41:46.354216 16352 generic.go:334] "Generic (PLEG): container finished" podID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerID="cd39d2635e34ca1b607a0d7f795b3d69ae5bebd4d892a8aa9dd45224acee1f66" exitCode=0 Mar 07 21:41:46.355533 master-0 kubenswrapper[16352]: I0307 21:41:46.354952 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" event={"ID":"6a84aa45-9fea-4aaa-8e68-500d08c4f625","Type":"ContainerDied","Data":"cd39d2635e34ca1b607a0d7f795b3d69ae5bebd4d892a8aa9dd45224acee1f66"} Mar 07 21:41:46.372763 master-0 kubenswrapper[16352]: I0307 21:41:46.366168 16352 generic.go:334] "Generic (PLEG): container finished" podID="d3d63cde-c147-4e68-8491-753368687501" containerID="d91ddbb60f350d9a8a522f80755adc3279c6c3272b25bb812f758dde17d24b22" exitCode=0 Mar 07 21:41:46.372763 master-0 kubenswrapper[16352]: I0307 21:41:46.366302 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ironic-20ba-account-create-update-4dtlr" event={"ID":"d3d63cde-c147-4e68-8491-753368687501","Type":"ContainerDied","Data":"d91ddbb60f350d9a8a522f80755adc3279c6c3272b25bb812f758dde17d24b22"} Mar 07 21:41:46.389275 master-0 kubenswrapper[16352]: I0307 21:41:46.387514 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:46.389275 master-0 kubenswrapper[16352]: I0307 21:41:46.388661 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:46.684619 master-0 kubenswrapper[16352]: I0307 21:41:46.684261 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:46.700286 master-0 kubenswrapper[16352]: I0307 21:41:46.690728 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:46.797994 master-0 kubenswrapper[16352]: I0307 21:41:46.797812 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-config-data\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.797994 master-0 kubenswrapper[16352]: I0307 21:41:46.797884 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-combined-ca-bundle\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:46.797994 master-0 kubenswrapper[16352]: I0307 21:41:46.797955 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-httpd-run\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:46.798437 master-0 kubenswrapper[16352]: I0307 21:41:46.798031 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-httpd-run\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.798437 master-0 kubenswrapper[16352]: I0307 21:41:46.798088 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-combined-ca-bundle\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.798437 master-0 kubenswrapper[16352]: I0307 21:41:46.798178 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-scripts\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:46.798437 master-0 kubenswrapper[16352]: I0307 21:41:46.798313 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-internal-tls-certs\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:46.798437 master-0 kubenswrapper[16352]: I0307 21:41:46.798373 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmdkp\" (UniqueName: \"kubernetes.io/projected/db8b9f34-b623-40eb-94be-0ad5eda49df9-kube-api-access-vmdkp\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: 
\"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:46.798437 master-0 kubenswrapper[16352]: I0307 21:41:46.798439 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-config-data\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.800241 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.802499 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-config-data" (OuterVolumeSpecName: "config-data") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.805036 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: "6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.805101 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.806879 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.807043 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5x6fd\" (UniqueName: \"kubernetes.io/projected/6745f880-a5c0-4d62-9182-ee075ed0d212-kube-api-access-5x6fd\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.807122 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-public-tls-certs\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.807162 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-logs\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.813015 master-0 
kubenswrapper[16352]: I0307 21:41:46.807235 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-scripts\") pod \"6745f880-a5c0-4d62-9182-ee075ed0d212\" (UID: \"6745f880-a5c0-4d62-9182-ee075ed0d212\") " Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.807371 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-logs\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.808305 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-logs" (OuterVolumeSpecName: "logs") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: "6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.810775 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: "6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.810831 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.810862 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.810878 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.810889 16352 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.810903 16352 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6745f880-a5c0-4d62-9182-ee075ed0d212-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.811923 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db8b9f34-b623-40eb-94be-0ad5eda49df9-kube-api-access-vmdkp" (OuterVolumeSpecName: "kube-api-access-vmdkp") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "kube-api-access-vmdkp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.812113 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-scripts" (OuterVolumeSpecName: "scripts") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: "6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.813015 master-0 kubenswrapper[16352]: I0307 21:41:46.812874 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-scripts" (OuterVolumeSpecName: "scripts") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.820709 master-0 kubenswrapper[16352]: I0307 21:41:46.814644 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6745f880-a5c0-4d62-9182-ee075ed0d212-kube-api-access-5x6fd" (OuterVolumeSpecName: "kube-api-access-5x6fd") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: "6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "kube-api-access-5x6fd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:46.820709 master-0 kubenswrapper[16352]: I0307 21:41:46.814936 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-config-data" (OuterVolumeSpecName: "config-data") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: "6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.820709 master-0 kubenswrapper[16352]: I0307 21:41:46.816071 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-logs" (OuterVolumeSpecName: "logs") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:41:46.837714 master-0 kubenswrapper[16352]: I0307 21:41:46.829637 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: "6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.853726 master-0 kubenswrapper[16352]: I0307 21:41:46.850058 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:46.917202 master-0 kubenswrapper[16352]: I0307 21:41:46.917126 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5x6fd\" (UniqueName: \"kubernetes.io/projected/6745f880-a5c0-4d62-9182-ee075ed0d212-kube-api-access-5x6fd\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.917202 master-0 kubenswrapper[16352]: I0307 21:41:46.917176 16352 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.917202 master-0 kubenswrapper[16352]: I0307 21:41:46.917187 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.917202 master-0 kubenswrapper[16352]: I0307 21:41:46.917198 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/db8b9f34-b623-40eb-94be-0ad5eda49df9-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.917202 master-0 kubenswrapper[16352]: I0307 21:41:46.917208 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.917202 master-0 kubenswrapper[16352]: I0307 21:41:46.917217 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6745f880-a5c0-4d62-9182-ee075ed0d212-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.917802 master-0 kubenswrapper[16352]: I0307 21:41:46.917227 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-scripts\") on node 
\"master-0\" DevicePath \"\"" Mar 07 21:41:46.917802 master-0 kubenswrapper[16352]: I0307 21:41:46.917239 16352 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/db8b9f34-b623-40eb-94be-0ad5eda49df9-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.917802 master-0 kubenswrapper[16352]: I0307 21:41:46.917249 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmdkp\" (UniqueName: \"kubernetes.io/projected/db8b9f34-b623-40eb-94be-0ad5eda49df9-kube-api-access-vmdkp\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:46.924701 master-0 kubenswrapper[16352]: I0307 21:41:46.924636 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:47.021130 master-0 kubenswrapper[16352]: I0307 21:41:47.019519 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfbdc\" (UniqueName: \"kubernetes.io/projected/673480b0-be7b-453c-b7b3-8646042b3e59-kube-api-access-sfbdc\") pod \"673480b0-be7b-453c-b7b3-8646042b3e59\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " Mar 07 21:41:47.021130 master-0 kubenswrapper[16352]: I0307 21:41:47.019608 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/673480b0-be7b-453c-b7b3-8646042b3e59-operator-scripts\") pod \"673480b0-be7b-453c-b7b3-8646042b3e59\" (UID: \"673480b0-be7b-453c-b7b3-8646042b3e59\") " Mar 07 21:41:47.021130 master-0 kubenswrapper[16352]: I0307 21:41:47.020509 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/673480b0-be7b-453c-b7b3-8646042b3e59-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "673480b0-be7b-453c-b7b3-8646042b3e59" (UID: "673480b0-be7b-453c-b7b3-8646042b3e59"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:47.026589 master-0 kubenswrapper[16352]: I0307 21:41:47.026530 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/673480b0-be7b-453c-b7b3-8646042b3e59-kube-api-access-sfbdc" (OuterVolumeSpecName: "kube-api-access-sfbdc") pod "673480b0-be7b-453c-b7b3-8646042b3e59" (UID: "673480b0-be7b-453c-b7b3-8646042b3e59"). InnerVolumeSpecName "kube-api-access-sfbdc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:47.123223 master-0 kubenswrapper[16352]: I0307 21:41:47.123104 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfbdc\" (UniqueName: \"kubernetes.io/projected/673480b0-be7b-453c-b7b3-8646042b3e59-kube-api-access-sfbdc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:47.123223 master-0 kubenswrapper[16352]: I0307 21:41:47.123163 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/673480b0-be7b-453c-b7b3-8646042b3e59-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:47.210588 master-0 kubenswrapper[16352]: I0307 21:41:47.210427 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7785e25a-92dc-4f55-bbdf-ed970b2ff79d" path="/var/lib/kubelet/pods/7785e25a-92dc-4f55-bbdf-ed970b2ff79d/volumes" Mar 07 21:41:47.409984 master-0 kubenswrapper[16352]: I0307 21:41:47.409511 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" event={"ID":"6a84aa45-9fea-4aaa-8e68-500d08c4f625","Type":"ContainerStarted","Data":"e74f55a7aaa366ef6f2d2160a993e43cd45ad2b9ce7ad66324b3aa5c5125bc27"} Mar 07 21:41:47.409984 master-0 kubenswrapper[16352]: I0307 21:41:47.409611 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:41:47.417107 master-0 kubenswrapper[16352]: I0307 21:41:47.416987 16352 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-create-j9hg2" Mar 07 21:41:47.417107 master-0 kubenswrapper[16352]: I0307 21:41:47.417017 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-create-j9hg2" event={"ID":"673480b0-be7b-453c-b7b3-8646042b3e59","Type":"ContainerDied","Data":"f26642a0a34d4dd316be2f06eee1ec4ddcdc84dd22ec11d087bb50eb0fb1820e"} Mar 07 21:41:47.417107 master-0 kubenswrapper[16352]: I0307 21:41:47.417052 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f26642a0a34d4dd316be2f06eee1ec4ddcdc84dd22ec11d087bb50eb0fb1820e" Mar 07 21:41:47.417107 master-0 kubenswrapper[16352]: I0307 21:41:47.416986 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.417107 master-0 kubenswrapper[16352]: I0307 21:41:47.417116 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:47.436601 master-0 kubenswrapper[16352]: I0307 21:41:47.436512 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" podStartSLOduration=5.436489336 podStartE2EDuration="5.436489336s" podCreationTimestamp="2026-03-07 21:41:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:47.433246358 +0000 UTC m=+1430.503951427" watchObservedRunningTime="2026-03-07 21:41:47.436489336 +0000 UTC m=+1430.507194395" Mar 07 21:41:47.545278 master-0 kubenswrapper[16352]: I0307 21:41:47.537041 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:47.545278 master-0 kubenswrapper[16352]: I0307 21:41:47.537128 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:47.584458 master-0 kubenswrapper[16352]: I0307 21:41:47.584395 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:47.585127 master-0 kubenswrapper[16352]: E0307 21:41:47.585106 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="673480b0-be7b-453c-b7b3-8646042b3e59" containerName="mariadb-database-create" Mar 07 21:41:47.585208 master-0 kubenswrapper[16352]: I0307 21:41:47.585128 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="673480b0-be7b-453c-b7b3-8646042b3e59" containerName="mariadb-database-create" Mar 07 21:41:47.585440 master-0 kubenswrapper[16352]: I0307 21:41:47.585421 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="673480b0-be7b-453c-b7b3-8646042b3e59" containerName="mariadb-database-create" Mar 07 21:41:47.588740 master-0 kubenswrapper[16352]: I0307 21:41:47.588358 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.593201 master-0 kubenswrapper[16352]: I0307 21:41:47.591423 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 07 21:41:47.593201 master-0 kubenswrapper[16352]: I0307 21:41:47.591590 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-213eb-default-internal-config-data" Mar 07 21:41:47.602277 master-0 kubenswrapper[16352]: I0307 21:41:47.602176 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:47.673756 master-0 kubenswrapper[16352]: I0307 21:41:47.673204 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.673756 master-0 kubenswrapper[16352]: I0307 21:41:47.673267 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.673756 master-0 kubenswrapper[16352]: I0307 21:41:47.673320 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk9bt\" (UniqueName: \"kubernetes.io/projected/34febdc7-58ae-4ec2-a8f3-92011ca01d81-kube-api-access-pk9bt\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.673756 
master-0 kubenswrapper[16352]: I0307 21:41:47.673393 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.673756 master-0 kubenswrapper[16352]: I0307 21:41:47.673421 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.673756 master-0 kubenswrapper[16352]: I0307 21:41:47.673442 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.673756 master-0 kubenswrapper[16352]: I0307 21:41:47.673466 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.729377 master-0 kubenswrapper[16352]: I0307 21:41:47.728764 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2" (OuterVolumeSpecName: "glance") pod "6745f880-a5c0-4d62-9182-ee075ed0d212" (UID: 
"6745f880-a5c0-4d62-9182-ee075ed0d212"). InnerVolumeSpecName "pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 21:41:47.767482 master-0 kubenswrapper[16352]: I0307 21:41:47.767386 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.775837 master-0 kubenswrapper[16352]: I0307 21:41:47.775787 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"db8b9f34-b623-40eb-94be-0ad5eda49df9\" (UID: \"db8b9f34-b623-40eb-94be-0ad5eda49df9\") " Mar 07 21:41:47.777127 master-0 kubenswrapper[16352]: I0307 21:41:47.777078 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.777267 master-0 kubenswrapper[16352]: I0307 21:41:47.777251 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.777435 master-0 kubenswrapper[16352]: I0307 21:41:47.777410 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk9bt\" (UniqueName: 
\"kubernetes.io/projected/34febdc7-58ae-4ec2-a8f3-92011ca01d81-kube-api-access-pk9bt\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.777703 master-0 kubenswrapper[16352]: I0307 21:41:47.777665 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.778041 master-0 kubenswrapper[16352]: I0307 21:41:47.778021 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.778136 master-0 kubenswrapper[16352]: I0307 21:41:47.778123 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.780630 master-0 kubenswrapper[16352]: I0307 21:41:47.780608 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.781963 master-0 kubenswrapper[16352]: I0307 21:41:47.781911 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.782704 master-0 kubenswrapper[16352]: I0307 21:41:47.782662 16352 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") on node \"master-0\" " Mar 07 21:41:47.783141 master-0 kubenswrapper[16352]: I0307 21:41:47.783007 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.786047 master-0 kubenswrapper[16352]: I0307 21:41:47.786027 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.788865 master-0 kubenswrapper[16352]: I0307 21:41:47.788362 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.789060 master-0 kubenswrapper[16352]: I0307 21:41:47.789015 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.807753 master-0 kubenswrapper[16352]: I0307 21:41:47.807612 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e" (OuterVolumeSpecName: "glance") pod "db8b9f34-b623-40eb-94be-0ad5eda49df9" (UID: "db8b9f34-b623-40eb-94be-0ad5eda49df9"). InnerVolumeSpecName "pvc-2828e4cd-2480-4309-bb23-a8e5342365ce". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 21:41:47.809158 master-0 kubenswrapper[16352]: I0307 21:41:47.809071 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.815791 master-0 kubenswrapper[16352]: I0307 21:41:47.815647 16352 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 07 21:41:47.816067 master-0 kubenswrapper[16352]: I0307 21:41:47.815849 16352 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3" (UniqueName: "kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2") on node "master-0" Mar 07 21:41:47.887198 master-0 kubenswrapper[16352]: I0307 21:41:47.886994 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:47.887532 master-0 kubenswrapper[16352]: I0307 21:41:47.887293 16352 reconciler_common.go:293] "Volume detached for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:47.923464 master-0 kubenswrapper[16352]: I0307 21:41:47.923403 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk9bt\" (UniqueName: \"kubernetes.io/projected/34febdc7-58ae-4ec2-a8f3-92011ca01d81-kube-api-access-pk9bt\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:48.150882 master-0 kubenswrapper[16352]: I0307 21:41:48.148876 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:41:48.162875 master-0 kubenswrapper[16352]: I0307 21:41:48.162703 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:41:48.174817 master-0 kubenswrapper[16352]: I0307 21:41:48.174739 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-213eb-default-external-api-0"] 
Mar 07 21:41:48.183508 master-0 kubenswrapper[16352]: I0307 21:41:48.179865 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.183662 master-0 kubenswrapper[16352]: I0307 21:41:48.183578 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 07 21:41:48.183804 master-0 kubenswrapper[16352]: I0307 21:41:48.183749 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-213eb-default-external-config-data" Mar 07 21:41:48.200502 master-0 kubenswrapper[16352]: I0307 21:41:48.200423 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:41:48.301214 master-0 kubenswrapper[16352]: I0307 21:41:48.300986 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.301214 master-0 kubenswrapper[16352]: I0307 21:41:48.301083 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.301214 master-0 kubenswrapper[16352]: I0307 21:41:48.301111 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-644h9\" (UniqueName: \"kubernetes.io/projected/f623599e-9cea-49ec-a621-f676a75574f9-kube-api-access-644h9\") pod 
\"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.301214 master-0 kubenswrapper[16352]: I0307 21:41:48.301133 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.301214 master-0 kubenswrapper[16352]: I0307 21:41:48.301179 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.301214 master-0 kubenswrapper[16352]: I0307 21:41:48.301214 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.301761 master-0 kubenswrapper[16352]: I0307 21:41:48.301244 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.301761 master-0 kubenswrapper[16352]: I0307 21:41:48.301261 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.406097 master-0 kubenswrapper[16352]: I0307 21:41:48.405970 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.406390 master-0 kubenswrapper[16352]: I0307 21:41:48.406168 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.406390 master-0 kubenswrapper[16352]: I0307 21:41:48.406305 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.406480 master-0 kubenswrapper[16352]: I0307 21:41:48.406345 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.407931 master-0 
kubenswrapper[16352]: I0307 21:41:48.406628 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.407931 master-0 kubenswrapper[16352]: I0307 21:41:48.406762 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.407931 master-0 kubenswrapper[16352]: I0307 21:41:48.406817 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-644h9\" (UniqueName: \"kubernetes.io/projected/f623599e-9cea-49ec-a621-f676a75574f9-kube-api-access-644h9\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.407931 master-0 kubenswrapper[16352]: I0307 21:41:48.406857 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.411955 master-0 kubenswrapper[16352]: I0307 21:41:48.411640 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: 
\"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.412448 master-0 kubenswrapper[16352]: I0307 21:41:48.412408 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.413275 master-0 kubenswrapper[16352]: I0307 21:41:48.413147 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 07 21:41:48.413275 master-0 kubenswrapper[16352]: I0307 21:41:48.413189 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a513f999ca477ba7af8fd57b1445c957c7136c73e46ac94a843087871d1d0d27/globalmount\"" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.413275 master-0 kubenswrapper[16352]: I0307 21:41:48.413221 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.415163 master-0 kubenswrapper[16352]: I0307 21:41:48.415049 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-logs\") pod \"glance-213eb-default-external-api-0\" (UID: 
\"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.418066 master-0 kubenswrapper[16352]: I0307 21:41:48.418022 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.427065 master-0 kubenswrapper[16352]: I0307 21:41:48.427008 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:48.434379 master-0 kubenswrapper[16352]: I0307 21:41:48.434330 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-644h9\" (UniqueName: \"kubernetes.io/projected/f623599e-9cea-49ec-a621-f676a75574f9-kube-api-access-644h9\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:49.126726 master-0 kubenswrapper[16352]: I0307 21:41:49.126643 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:49.149876 master-0 kubenswrapper[16352]: I0307 21:41:49.149792 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:41:49.205578 master-0 kubenswrapper[16352]: I0307 21:41:49.205272 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6745f880-a5c0-4d62-9182-ee075ed0d212" path="/var/lib/kubelet/pods/6745f880-a5c0-4d62-9182-ee075ed0d212/volumes" Mar 07 21:41:49.206334 master-0 kubenswrapper[16352]: I0307 21:41:49.206295 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db8b9f34-b623-40eb-94be-0ad5eda49df9" path="/var/lib/kubelet/pods/db8b9f34-b623-40eb-94be-0ad5eda49df9/volumes" Mar 07 21:41:50.391539 master-0 kubenswrapper[16352]: I0307 21:41:50.391474 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:50.460792 master-0 kubenswrapper[16352]: I0307 21:41:50.460665 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:50.476608 master-0 kubenswrapper[16352]: I0307 21:41:50.476529 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63cde-c147-4e68-8491-753368687501-operator-scripts\") pod \"d3d63cde-c147-4e68-8491-753368687501\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " Mar 07 21:41:50.476994 master-0 kubenswrapper[16352]: I0307 21:41:50.476953 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svwtn\" (UniqueName: \"kubernetes.io/projected/d3d63cde-c147-4e68-8491-753368687501-kube-api-access-svwtn\") pod \"d3d63cde-c147-4e68-8491-753368687501\" (UID: \"d3d63cde-c147-4e68-8491-753368687501\") " 
Mar 07 21:41:50.477203 master-0 kubenswrapper[16352]: I0307 21:41:50.477151 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d3d63cde-c147-4e68-8491-753368687501-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d3d63cde-c147-4e68-8491-753368687501" (UID: "d3d63cde-c147-4e68-8491-753368687501"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:41:50.477924 master-0 kubenswrapper[16352]: I0307 21:41:50.477872 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-20ba-account-create-update-4dtlr" event={"ID":"d3d63cde-c147-4e68-8491-753368687501","Type":"ContainerDied","Data":"a0acb5213a4702c7fa0ec09a51184681814cce4c540330288544a4751a7b03f5"} Mar 07 21:41:50.478006 master-0 kubenswrapper[16352]: I0307 21:41:50.477929 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a0acb5213a4702c7fa0ec09a51184681814cce4c540330288544a4751a7b03f5" Mar 07 21:41:50.478006 master-0 kubenswrapper[16352]: I0307 21:41:50.477901 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-20ba-account-create-update-4dtlr" Mar 07 21:41:50.479207 master-0 kubenswrapper[16352]: I0307 21:41:50.479137 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d3d63cde-c147-4e68-8491-753368687501-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:50.482350 master-0 kubenswrapper[16352]: I0307 21:41:50.482255 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d3d63cde-c147-4e68-8491-753368687501-kube-api-access-svwtn" (OuterVolumeSpecName: "kube-api-access-svwtn") pod "d3d63cde-c147-4e68-8491-753368687501" (UID: "d3d63cde-c147-4e68-8491-753368687501"). InnerVolumeSpecName "kube-api-access-svwtn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:50.483113 master-0 kubenswrapper[16352]: I0307 21:41:50.483068 16352 generic.go:334] "Generic (PLEG): container finished" podID="2c8b935a-cbf8-4c36-918b-eb0d89edab86" containerID="fc994383f7876be1e38177029dc4289405bb594f54e4d51c6a4f561d4a6fb893" exitCode=0 Mar 07 21:41:50.483176 master-0 kubenswrapper[16352]: I0307 21:41:50.483113 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6pmp" event={"ID":"2c8b935a-cbf8-4c36-918b-eb0d89edab86","Type":"ContainerDied","Data":"fc994383f7876be1e38177029dc4289405bb594f54e4d51c6a4f561d4a6fb893"} Mar 07 21:41:50.581621 master-0 kubenswrapper[16352]: I0307 21:41:50.581543 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svwtn\" (UniqueName: \"kubernetes.io/projected/d3d63cde-c147-4e68-8491-753368687501-kube-api-access-svwtn\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:50.623469 master-0 kubenswrapper[16352]: I0307 21:41:50.623362 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:41:50.708888 master-0 kubenswrapper[16352]: W0307 21:41:50.708669 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34febdc7_58ae_4ec2_a8f3_92011ca01d81.slice/crio-53434ff7cb569c3a68cf778605fd4dfa0469825f562a4a135f5c46c83a3910bf WatchSource:0}: Error finding container 53434ff7cb569c3a68cf778605fd4dfa0469825f562a4a135f5c46c83a3910bf: Status 404 returned error can't find the container with id 53434ff7cb569c3a68cf778605fd4dfa0469825f562a4a135f5c46c83a3910bf Mar 07 21:41:50.729806 master-0 kubenswrapper[16352]: I0307 21:41:50.729757 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:41:50.884824 master-0 kubenswrapper[16352]: E0307 21:41:50.884718 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3d63cde_c147_4e68_8491_753368687501.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:41:51.229353 master-0 kubenswrapper[16352]: I0307 21:41:51.229283 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:41:51.233531 master-0 kubenswrapper[16352]: W0307 21:41:51.233371 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf623599e_9cea_49ec_a621_f676a75574f9.slice/crio-707ceeffb2d395e854b73ef5f636b608a948fcf872762451c33b86b6d8dfdc22 WatchSource:0}: Error finding container 707ceeffb2d395e854b73ef5f636b608a948fcf872762451c33b86b6d8dfdc22: Status 404 returned error can't find the container with id 707ceeffb2d395e854b73ef5f636b608a948fcf872762451c33b86b6d8dfdc22 Mar 07 21:41:51.497242 master-0 kubenswrapper[16352]: I0307 21:41:51.497080 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"f623599e-9cea-49ec-a621-f676a75574f9","Type":"ContainerStarted","Data":"707ceeffb2d395e854b73ef5f636b608a948fcf872762451c33b86b6d8dfdc22"} Mar 07 21:41:51.500789 master-0 kubenswrapper[16352]: I0307 21:41:51.500671 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"34febdc7-58ae-4ec2-a8f3-92011ca01d81","Type":"ContainerStarted","Data":"58b019180d18af9978ed984fa8ea3f3388b3fd37aa0ec168b38ef005be8346d2"} Mar 07 21:41:51.500789 master-0 kubenswrapper[16352]: I0307 21:41:51.500769 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"34febdc7-58ae-4ec2-a8f3-92011ca01d81","Type":"ContainerStarted","Data":"53434ff7cb569c3a68cf778605fd4dfa0469825f562a4a135f5c46c83a3910bf"} Mar 07 21:41:51.509493 master-0 kubenswrapper[16352]: I0307 21:41:51.508065 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4wkkv" event={"ID":"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985","Type":"ContainerStarted","Data":"2290b63b0ddc2e5a9d606527b15e706a487fb1316d1c8f0fb7ec06686188f238"} Mar 07 21:41:51.535851 master-0 kubenswrapper[16352]: I0307 21:41:51.535750 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-4wkkv" podStartSLOduration=4.423331756 podStartE2EDuration="9.535720373s" podCreationTimestamp="2026-03-07 21:41:42 +0000 UTC" firstStartedPulling="2026-03-07 21:41:44.991848875 +0000 UTC m=+1428.062553934" lastFinishedPulling="2026-03-07 21:41:50.104237492 +0000 UTC m=+1433.174942551" observedRunningTime="2026-03-07 21:41:51.531244196 +0000 UTC m=+1434.601949265" watchObservedRunningTime="2026-03-07 21:41:51.535720373 +0000 UTC m=+1434.606425432" Mar 07 21:41:52.133029 master-0 kubenswrapper[16352]: I0307 21:41:52.132949 16352 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:52.224723 master-0 kubenswrapper[16352]: I0307 21:41:52.224630 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qz65x\" (UniqueName: \"kubernetes.io/projected/2c8b935a-cbf8-4c36-918b-eb0d89edab86-kube-api-access-qz65x\") pod \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " Mar 07 21:41:52.224723 master-0 kubenswrapper[16352]: I0307 21:41:52.224766 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-combined-ca-bundle\") pod \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " Mar 07 21:41:52.225354 master-0 kubenswrapper[16352]: I0307 21:41:52.224818 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-credential-keys\") pod \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " Mar 07 21:41:52.225354 master-0 kubenswrapper[16352]: I0307 21:41:52.224847 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-config-data\") pod \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " Mar 07 21:41:52.225354 master-0 kubenswrapper[16352]: I0307 21:41:52.224877 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-fernet-keys\") pod \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " Mar 07 21:41:52.227066 master-0 kubenswrapper[16352]: I0307 
21:41:52.227037 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-scripts\") pod \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\" (UID: \"2c8b935a-cbf8-4c36-918b-eb0d89edab86\") " Mar 07 21:41:52.230423 master-0 kubenswrapper[16352]: I0307 21:41:52.230363 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c8b935a-cbf8-4c36-918b-eb0d89edab86-kube-api-access-qz65x" (OuterVolumeSpecName: "kube-api-access-qz65x") pod "2c8b935a-cbf8-4c36-918b-eb0d89edab86" (UID: "2c8b935a-cbf8-4c36-918b-eb0d89edab86"). InnerVolumeSpecName "kube-api-access-qz65x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:41:52.232552 master-0 kubenswrapper[16352]: I0307 21:41:52.232482 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-scripts" (OuterVolumeSpecName: "scripts") pod "2c8b935a-cbf8-4c36-918b-eb0d89edab86" (UID: "2c8b935a-cbf8-4c36-918b-eb0d89edab86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:52.232552 master-0 kubenswrapper[16352]: I0307 21:41:52.232518 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2c8b935a-cbf8-4c36-918b-eb0d89edab86" (UID: "2c8b935a-cbf8-4c36-918b-eb0d89edab86"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:52.233741 master-0 kubenswrapper[16352]: I0307 21:41:52.233650 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2c8b935a-cbf8-4c36-918b-eb0d89edab86" (UID: "2c8b935a-cbf8-4c36-918b-eb0d89edab86"). 
InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:52.264746 master-0 kubenswrapper[16352]: I0307 21:41:52.264655 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2c8b935a-cbf8-4c36-918b-eb0d89edab86" (UID: "2c8b935a-cbf8-4c36-918b-eb0d89edab86"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:52.267273 master-0 kubenswrapper[16352]: I0307 21:41:52.267226 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-config-data" (OuterVolumeSpecName: "config-data") pod "2c8b935a-cbf8-4c36-918b-eb0d89edab86" (UID: "2c8b935a-cbf8-4c36-918b-eb0d89edab86"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:41:52.333418 master-0 kubenswrapper[16352]: I0307 21:41:52.331664 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:52.333418 master-0 kubenswrapper[16352]: I0307 21:41:52.331715 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qz65x\" (UniqueName: \"kubernetes.io/projected/2c8b935a-cbf8-4c36-918b-eb0d89edab86-kube-api-access-qz65x\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:52.333418 master-0 kubenswrapper[16352]: I0307 21:41:52.331731 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:52.333418 master-0 kubenswrapper[16352]: I0307 21:41:52.331743 16352 reconciler_common.go:293] "Volume detached for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:52.333418 master-0 kubenswrapper[16352]: I0307 21:41:52.331752 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:52.333418 master-0 kubenswrapper[16352]: I0307 21:41:52.331761 16352 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2c8b935a-cbf8-4c36-918b-eb0d89edab86-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 07 21:41:52.535095 master-0 kubenswrapper[16352]: I0307 21:41:52.534935 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"f623599e-9cea-49ec-a621-f676a75574f9","Type":"ContainerStarted","Data":"919ca7a10f7d2057908b264c89ded111932385985e0c2acc946a7e48c2af77dc"} Mar 07 21:41:52.541995 master-0 kubenswrapper[16352]: I0307 21:41:52.539562 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"34febdc7-58ae-4ec2-a8f3-92011ca01d81","Type":"ContainerStarted","Data":"d75b2a2e94bc7bcc078b90db1a505b9b3425739b0825ebdd8b77bcc6009b212e"} Mar 07 21:41:52.545954 master-0 kubenswrapper[16352]: I0307 21:41:52.544782 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-m6pmp" Mar 07 21:41:52.545954 master-0 kubenswrapper[16352]: I0307 21:41:52.544767 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-m6pmp" event={"ID":"2c8b935a-cbf8-4c36-918b-eb0d89edab86","Type":"ContainerDied","Data":"443d3fbb82da10045674595b92d6df413bb31586370ba36c2534f8f6f6d60720"} Mar 07 21:41:52.545954 master-0 kubenswrapper[16352]: I0307 21:41:52.544845 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443d3fbb82da10045674595b92d6df413bb31586370ba36c2534f8f6f6d60720" Mar 07 21:41:52.923621 master-0 kubenswrapper[16352]: I0307 21:41:52.923457 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-213eb-default-internal-api-0" podStartSLOduration=5.923425744 podStartE2EDuration="5.923425744s" podCreationTimestamp="2026-03-07 21:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:52.895279308 +0000 UTC m=+1435.965984387" watchObservedRunningTime="2026-03-07 21:41:52.923425744 +0000 UTC m=+1435.994130813" Mar 07 21:41:53.525667 master-0 kubenswrapper[16352]: I0307 21:41:53.525588 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-m6pmp"] Mar 07 21:41:53.540481 master-0 kubenswrapper[16352]: I0307 21:41:53.540395 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-m6pmp"] Mar 07 21:41:53.559117 master-0 kubenswrapper[16352]: I0307 21:41:53.559029 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"f623599e-9cea-49ec-a621-f676a75574f9","Type":"ContainerStarted","Data":"2ce34483211f1234aa598d6b4877465bf2b3da6001d0f92033be8f259c68d8a2"} Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.626764 16352 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-zh2n5"] Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: E0307 21:41:53.627527 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c8b935a-cbf8-4c36-918b-eb0d89edab86" containerName="keystone-bootstrap" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.627547 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c8b935a-cbf8-4c36-918b-eb0d89edab86" containerName="keystone-bootstrap" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: E0307 21:41:53.627584 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d3d63cde-c147-4e68-8491-753368687501" containerName="mariadb-account-create-update" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.627591 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="d3d63cde-c147-4e68-8491-753368687501" containerName="mariadb-account-create-update" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.627935 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="d3d63cde-c147-4e68-8491-753368687501" containerName="mariadb-account-create-update" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.628025 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c8b935a-cbf8-4c36-918b-eb0d89edab86" containerName="keystone-bootstrap" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.629058 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-zh2n5" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.634965 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 07 21:41:53.635824 master-0 kubenswrapper[16352]: I0307 21:41:53.635521 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 07 21:41:53.636354 master-0 kubenswrapper[16352]: I0307 21:41:53.635973 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 07 21:41:53.636354 master-0 kubenswrapper[16352]: I0307 21:41:53.636135 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 07 21:41:53.647181 master-0 kubenswrapper[16352]: I0307 21:41:53.647076 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-213eb-default-external-api-0" podStartSLOduration=5.647053794 podStartE2EDuration="5.647053794s" podCreationTimestamp="2026-03-07 21:41:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:41:53.627057434 +0000 UTC m=+1436.697762503" watchObservedRunningTime="2026-03-07 21:41:53.647053794 +0000 UTC m=+1436.717758853" Mar 07 21:41:53.647972 master-0 kubenswrapper[16352]: I0307 21:41:53.647893 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zh2n5"] Mar 07 21:41:53.695781 master-0 kubenswrapper[16352]: I0307 21:41:53.693758 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-db-sync-mtvqh"] Mar 07 21:41:53.707638 master-0 kubenswrapper[16352]: I0307 21:41:53.706523 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-mtvqh"] Mar 07 21:41:53.707638 master-0 kubenswrapper[16352]: I0307 21:41:53.706725 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.710984 master-0 kubenswrapper[16352]: I0307 21:41:53.710923 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data"
Mar 07 21:41:53.711463 master-0 kubenswrapper[16352]: I0307 21:41:53.711419 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-scripts"
Mar 07 21:41:53.784582 master-0 kubenswrapper[16352]: I0307 21:41:53.784377 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-fernet-keys\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.784582 master-0 kubenswrapper[16352]: I0307 21:41:53.784503 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qwlq\" (UniqueName: \"kubernetes.io/projected/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-kube-api-access-6qwlq\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.785004 master-0 kubenswrapper[16352]: I0307 21:41:53.784595 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-credential-keys\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.785004 master-0 kubenswrapper[16352]: I0307 21:41:53.784670 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2a6736c6-a65f-4821-91a1-747418c62459-etc-podinfo\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.785004 master-0 kubenswrapper[16352]: I0307 21:41:53.784724 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-scripts\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.785004 master-0 kubenswrapper[16352]: I0307 21:41:53.784877 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-combined-ca-bundle\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.785004 master-0 kubenswrapper[16352]: I0307 21:41:53.784938 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-config-data\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.785004 master-0 kubenswrapper[16352]: I0307 21:41:53.784994 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-scripts\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.785281 master-0 kubenswrapper[16352]: I0307 21:41:53.785046 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.785281 master-0 kubenswrapper[16352]: I0307 21:41:53.785106 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2nv9\" (UniqueName: \"kubernetes.io/projected/2a6736c6-a65f-4821-91a1-747418c62459-kube-api-access-b2nv9\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.785519 master-0 kubenswrapper[16352]: I0307 21:41:53.785470 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a6736c6-a65f-4821-91a1-747418c62459-config-data-merged\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.785603 master-0 kubenswrapper[16352]: I0307 21:41:53.785581 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-config-data\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.803243 master-0 kubenswrapper[16352]: I0307 21:41:53.802973 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh"
Mar 07 21:41:53.895374 master-0 kubenswrapper[16352]: I0307 21:41:53.895291 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-config-data\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.895747 master-0 kubenswrapper[16352]: I0307 21:41:53.895601 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-fernet-keys\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.895747 master-0 kubenswrapper[16352]: I0307 21:41:53.895673 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6qwlq\" (UniqueName: \"kubernetes.io/projected/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-kube-api-access-6qwlq\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.896076 master-0 kubenswrapper[16352]: I0307 21:41:53.896044 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-credential-keys\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.896135 master-0 kubenswrapper[16352]: I0307 21:41:53.896111 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2a6736c6-a65f-4821-91a1-747418c62459-etc-podinfo\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.896172 master-0 kubenswrapper[16352]: I0307 21:41:53.896157 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-scripts\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.896584 master-0 kubenswrapper[16352]: I0307 21:41:53.896513 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-combined-ca-bundle\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.896727 master-0 kubenswrapper[16352]: I0307 21:41:53.896671 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-config-data\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.896833 master-0 kubenswrapper[16352]: I0307 21:41:53.896796 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-scripts\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.897071 master-0 kubenswrapper[16352]: I0307 21:41:53.897013 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.897220 master-0 kubenswrapper[16352]: I0307 21:41:53.897183 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2nv9\" (UniqueName: \"kubernetes.io/projected/2a6736c6-a65f-4821-91a1-747418c62459-kube-api-access-b2nv9\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.897477 master-0 kubenswrapper[16352]: I0307 21:41:53.897443 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a6736c6-a65f-4821-91a1-747418c62459-config-data-merged\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.898211 master-0 kubenswrapper[16352]: I0307 21:41:53.898181 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a6736c6-a65f-4821-91a1-747418c62459-config-data-merged\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.902030 master-0 kubenswrapper[16352]: I0307 21:41:53.901990 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-config-data\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.902958 master-0 kubenswrapper[16352]: I0307 21:41:53.902905 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.904468 master-0 kubenswrapper[16352]: I0307 21:41:53.904422 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-config-data\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.906011 master-0 kubenswrapper[16352]: I0307 21:41:53.905971 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-scripts\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.906556 master-0 kubenswrapper[16352]: I0307 21:41:53.906509 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-combined-ca-bundle\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.907087 master-0 kubenswrapper[16352]: I0307 21:41:53.907038 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-scripts\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.907282 master-0 kubenswrapper[16352]: I0307 21:41:53.907246 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2a6736c6-a65f-4821-91a1-747418c62459-etc-podinfo\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.908050 master-0 kubenswrapper[16352]: I0307 21:41:53.908011 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-fernet-keys\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.914161 master-0 kubenswrapper[16352]: I0307 21:41:53.914091 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-credential-keys\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.934798 master-0 kubenswrapper[16352]: I0307 21:41:53.926777 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2nv9\" (UniqueName: \"kubernetes.io/projected/2a6736c6-a65f-4821-91a1-747418c62459-kube-api-access-b2nv9\") pod \"ironic-db-sync-mtvqh\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:53.934798 master-0 kubenswrapper[16352]: I0307 21:41:53.934120 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qwlq\" (UniqueName: \"kubernetes.io/projected/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-kube-api-access-6qwlq\") pod \"keystone-bootstrap-zh2n5\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:53.983308 master-0 kubenswrapper[16352]: I0307 21:41:53.983230 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6465c5fc85-2kk4v"]
Mar 07 21:41:53.984896 master-0 kubenswrapper[16352]: I0307 21:41:53.983523 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="dnsmasq-dns" containerID="cri-o://ffaccf5640c9098eb475fc97a6d7aace0b5d4318b043d69351571c0b81b30a75" gracePeriod=10
Mar 07 21:41:54.031117 master-0 kubenswrapper[16352]: I0307 21:41:54.031051 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zh2n5"
Mar 07 21:41:54.047135 master-0 kubenswrapper[16352]: I0307 21:41:54.047001 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-db-sync-mtvqh"
Mar 07 21:41:55.205748 master-0 kubenswrapper[16352]: I0307 21:41:55.204879 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c8b935a-cbf8-4c36-918b-eb0d89edab86" path="/var/lib/kubelet/pods/2c8b935a-cbf8-4c36-918b-eb0d89edab86/volumes"
Mar 07 21:41:57.223588 master-0 kubenswrapper[16352]: I0307 21:41:57.223387 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.195:5353: connect: connection refused"
Mar 07 21:41:59.150966 master-0 kubenswrapper[16352]: I0307 21:41:59.150875 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:41:59.150966 master-0 kubenswrapper[16352]: I0307 21:41:59.150975 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:41:59.208810 master-0 kubenswrapper[16352]: I0307 21:41:59.208644 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:41:59.208810 master-0 kubenswrapper[16352]: I0307 21:41:59.208744 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:41:59.658602 master-0 kubenswrapper[16352]: I0307 21:41:59.658527 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:41:59.658602 master-0 kubenswrapper[16352]: I0307 21:41:59.658599 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:42:00.627895 master-0 kubenswrapper[16352]: I0307 21:42:00.623879 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:00.627895 master-0 kubenswrapper[16352]: I0307 21:42:00.623968 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:00.675325 master-0 kubenswrapper[16352]: I0307 21:42:00.673294 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:00.675325 master-0 kubenswrapper[16352]: I0307 21:42:00.674073 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:00.683981 master-0 kubenswrapper[16352]: I0307 21:42:00.683922 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:01.706008 master-0 kubenswrapper[16352]: I0307 21:42:01.705917 16352 generic.go:334] "Generic (PLEG): container finished" podID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerID="ffaccf5640c9098eb475fc97a6d7aace0b5d4318b043d69351571c0b81b30a75" exitCode=0
Mar 07 21:42:01.708376 master-0 kubenswrapper[16352]: I0307 21:42:01.708312 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" event={"ID":"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e","Type":"ContainerDied","Data":"ffaccf5640c9098eb475fc97a6d7aace0b5d4318b043d69351571c0b81b30a75"}
Mar 07 21:42:01.708376 master-0 kubenswrapper[16352]: I0307 21:42:01.708374 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:01.778671 master-0 kubenswrapper[16352]: I0307 21:42:01.778572 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:42:01.779088 master-0 kubenswrapper[16352]: I0307 21:42:01.778738 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:42:01.781427 master-0 kubenswrapper[16352]: I0307 21:42:01.781327 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:42:02.718435 master-0 kubenswrapper[16352]: I0307 21:42:02.718321 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:42:02.851586 master-0 kubenswrapper[16352]: I0307 21:42:02.851528 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:03.610464 master-0 kubenswrapper[16352]: I0307 21:42:03.610409 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v"
Mar 07 21:42:03.730277 master-0 kubenswrapper[16352]: I0307 21:42:03.730171 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-svc\") pod \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") "
Mar 07 21:42:03.732183 master-0 kubenswrapper[16352]: I0307 21:42:03.731218 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-swift-storage-0\") pod \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") "
Mar 07 21:42:03.732183 master-0 kubenswrapper[16352]: I0307 21:42:03.731294 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kp9dz\" (UniqueName: \"kubernetes.io/projected/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-kube-api-access-kp9dz\") pod \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") "
Mar 07 21:42:03.732183 master-0 kubenswrapper[16352]: I0307 21:42:03.731356 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-config\") pod \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") "
Mar 07 21:42:03.732183 master-0 kubenswrapper[16352]: I0307 21:42:03.731449 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-nb\") pod \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") "
Mar 07 21:42:03.732183 master-0 kubenswrapper[16352]: I0307 21:42:03.731534 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-sb\") pod \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\" (UID: \"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e\") "
Mar 07 21:42:03.747796 master-0 kubenswrapper[16352]: I0307 21:42:03.747314 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" event={"ID":"aae009f4-e625-45c1-a7b5-f8f5ca02bb1e","Type":"ContainerDied","Data":"d1d9a7d6c0af8c3298585374f2e00e16e40eb9f4a1714860551e003c14d8d18d"}
Mar 07 21:42:03.747796 master-0 kubenswrapper[16352]: I0307 21:42:03.747395 16352 scope.go:117] "RemoveContainer" containerID="ffaccf5640c9098eb475fc97a6d7aace0b5d4318b043d69351571c0b81b30a75"
Mar 07 21:42:03.747796 master-0 kubenswrapper[16352]: I0307 21:42:03.747577 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v"
Mar 07 21:42:03.755596 master-0 kubenswrapper[16352]: I0307 21:42:03.753840 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-kube-api-access-kp9dz" (OuterVolumeSpecName: "kube-api-access-kp9dz") pod "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" (UID: "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e"). InnerVolumeSpecName "kube-api-access-kp9dz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:42:03.769571 master-0 kubenswrapper[16352]: I0307 21:42:03.769496 16352 generic.go:334] "Generic (PLEG): container finished" podID="b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" containerID="2290b63b0ddc2e5a9d606527b15e706a487fb1316d1c8f0fb7ec06686188f238" exitCode=0
Mar 07 21:42:03.778109 master-0 kubenswrapper[16352]: I0307 21:42:03.776124 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4wkkv" event={"ID":"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985","Type":"ContainerDied","Data":"2290b63b0ddc2e5a9d606527b15e706a487fb1316d1c8f0fb7ec06686188f238"}
Mar 07 21:42:03.778109 master-0 kubenswrapper[16352]: I0307 21:42:03.777983 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:42:03.797246 master-0 kubenswrapper[16352]: I0307 21:42:03.796733 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" (UID: "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:03.836081 master-0 kubenswrapper[16352]: I0307 21:42:03.831554 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" (UID: "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:03.840987 master-0 kubenswrapper[16352]: I0307 21:42:03.837959 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:03.840987 master-0 kubenswrapper[16352]: I0307 21:42:03.838012 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kp9dz\" (UniqueName: \"kubernetes.io/projected/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-kube-api-access-kp9dz\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:03.840987 master-0 kubenswrapper[16352]: I0307 21:42:03.838022 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:03.871099 master-0 kubenswrapper[16352]: I0307 21:42:03.864921 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" (UID: "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:03.899750 master-0 kubenswrapper[16352]: I0307 21:42:03.879464 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-config" (OuterVolumeSpecName: "config") pod "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" (UID: "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:03.923230 master-0 kubenswrapper[16352]: I0307 21:42:03.907934 16352 scope.go:117] "RemoveContainer" containerID="736e60a4114fa1b5c5e58807a6770c62c71f078d309417e7b6b756864891ba0e"
Mar 07 21:42:03.923230 master-0 kubenswrapper[16352]: I0307 21:42:03.908790 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" (UID: "aae009f4-e625-45c1-a7b5-f8f5ca02bb1e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:03.944383 master-0 kubenswrapper[16352]: I0307 21:42:03.940649 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:03.944383 master-0 kubenswrapper[16352]: I0307 21:42:03.940694 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:03.944383 master-0 kubenswrapper[16352]: I0307 21:42:03.940705 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:04.096962 master-0 kubenswrapper[16352]: I0307 21:42:04.096893 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:42:04.106835 master-0 kubenswrapper[16352]: I0307 21:42:04.106750 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6465c5fc85-2kk4v"]
Mar 07 21:42:04.130381 master-0 kubenswrapper[16352]: I0307 21:42:04.122650 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6465c5fc85-2kk4v"]
Mar 07 21:42:04.203905 master-0 kubenswrapper[16352]: W0307 21:42:04.203826 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2bcad141_6b4f_4b5c_a3e0_236d54fe850c.slice/crio-c9ec7f655936c2bc62307b0b463364a71534bb981e11e4c386c166c5c91be910 WatchSource:0}: Error finding container c9ec7f655936c2bc62307b0b463364a71534bb981e11e4c386c166c5c91be910: Status 404 returned error can't find the container with id c9ec7f655936c2bc62307b0b463364a71534bb981e11e4c386c166c5c91be910
Mar 07 21:42:04.222691 master-0 kubenswrapper[16352]: I0307 21:42:04.221133 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-zh2n5"]
Mar 07 21:42:04.233103 master-0 kubenswrapper[16352]: I0307 21:42:04.232994 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 07 21:42:04.253859 master-0 kubenswrapper[16352]: I0307 21:42:04.253378 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-db-sync-mtvqh"]
Mar 07 21:42:04.796738 master-0 kubenswrapper[16352]: I0307 21:42:04.796009 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zh2n5" event={"ID":"2bcad141-6b4f-4b5c-a3e0-236d54fe850c","Type":"ContainerStarted","Data":"2268a2fafaeba584ed472288cc56505476177236c2d68526b6349a3f1741d5b3"}
Mar 07 21:42:04.796738 master-0 kubenswrapper[16352]: I0307 21:42:04.796086 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zh2n5" event={"ID":"2bcad141-6b4f-4b5c-a3e0-236d54fe850c","Type":"ContainerStarted","Data":"c9ec7f655936c2bc62307b0b463364a71534bb981e11e4c386c166c5c91be910"}
Mar 07 21:42:04.800730 master-0 kubenswrapper[16352]: I0307 21:42:04.799290 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-db-sync-m7xht" event={"ID":"c17f59fb-df31-45d5-9077-ac10aa310af2","Type":"ContainerStarted","Data":"12f5f40a8fa04773bbd3cbec144234660f8987f2ad2157088c2d0af0fb8c14f7"}
Mar 07 21:42:04.804731 master-0 kubenswrapper[16352]: I0307 21:42:04.802483 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-mtvqh" event={"ID":"2a6736c6-a65f-4821-91a1-747418c62459","Type":"ContainerStarted","Data":"d400d729dcdfdb910c4362701c566ac62af786a6b20da71b6e9400fe70e62280"}
Mar 07 21:42:04.833520 master-0 kubenswrapper[16352]: I0307 21:42:04.833417 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-zh2n5" podStartSLOduration=11.833393071 podStartE2EDuration="11.833393071s" podCreationTimestamp="2026-03-07 21:41:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:04.825253435 +0000 UTC m=+1447.895958514" watchObservedRunningTime="2026-03-07 21:42:04.833393071 +0000 UTC m=+1447.904098130"
Mar 07 21:42:04.868590 master-0 kubenswrapper[16352]: I0307 21:42:04.868485 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-db-sync-m7xht" podStartSLOduration=3.354841877 podStartE2EDuration="22.868457482s" podCreationTimestamp="2026-03-07 21:41:42 +0000 UTC" firstStartedPulling="2026-03-07 21:41:44.394318581 +0000 UTC m=+1427.465023640" lastFinishedPulling="2026-03-07 21:42:03.907934186 +0000 UTC m=+1446.978639245" observedRunningTime="2026-03-07 21:42:04.857417387 +0000 UTC m=+1447.928122456" watchObservedRunningTime="2026-03-07 21:42:04.868457482 +0000 UTC m=+1447.939162541"
Mar 07 21:42:05.215217 master-0 kubenswrapper[16352]: I0307 21:42:05.215153 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" path="/var/lib/kubelet/pods/aae009f4-e625-45c1-a7b5-f8f5ca02bb1e/volumes"
Mar 07 21:42:05.301146 master-0 kubenswrapper[16352]: I0307 21:42:05.301063 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4wkkv"
Mar 07 21:42:05.399633 master-0 kubenswrapper[16352]: I0307 21:42:05.399273 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-config-data\") pod \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") "
Mar 07 21:42:05.399633 master-0 kubenswrapper[16352]: I0307 21:42:05.399399 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-combined-ca-bundle\") pod \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") "
Mar 07 21:42:05.399633 master-0 kubenswrapper[16352]: I0307 21:42:05.399541 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-logs\") pod \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") "
Mar 07 21:42:05.399633 master-0 kubenswrapper[16352]: I0307 21:42:05.399581 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rbxwk\" (UniqueName: \"kubernetes.io/projected/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-kube-api-access-rbxwk\") pod \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") "
Mar 07 21:42:05.401043 master-0 kubenswrapper[16352]: I0307 21:42:05.399807 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-scripts\") pod \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\" (UID: \"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985\") "
Mar 07 21:42:05.401815 master-0 kubenswrapper[16352]: I0307 21:42:05.401480 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-logs" (OuterVolumeSpecName: "logs") pod "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" (UID: "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:42:05.406523 master-0 kubenswrapper[16352]: I0307 21:42:05.404342 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-kube-api-access-rbxwk" (OuterVolumeSpecName: "kube-api-access-rbxwk") pod "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" (UID: "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985"). InnerVolumeSpecName "kube-api-access-rbxwk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:42:05.406523 master-0 kubenswrapper[16352]: I0307 21:42:05.404640 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-scripts" (OuterVolumeSpecName: "scripts") pod "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" (UID: "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:05.436432 master-0 kubenswrapper[16352]: I0307 21:42:05.436341 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-config-data" (OuterVolumeSpecName: "config-data") pod "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" (UID: "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:05.449857 master-0 kubenswrapper[16352]: I0307 21:42:05.448883 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" (UID: "b0c040c8-ff93-4fd1-8f24-41bf2e0a8985"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:05.502945 master-0 kubenswrapper[16352]: I0307 21:42:05.502870 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:05.502945 master-0 kubenswrapper[16352]: I0307 21:42:05.502935 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:05.502945 master-0 kubenswrapper[16352]: I0307 21:42:05.502949 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:05.503301 master-0 kubenswrapper[16352]: I0307 21:42:05.502962 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-logs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:05.503301 master-0 kubenswrapper[16352]: I0307 21:42:05.502975 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rbxwk\" (UniqueName: \"kubernetes.io/projected/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985-kube-api-access-rbxwk\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:05.822669 master-0 kubenswrapper[16352]: I0307 21:42:05.822522 
16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-4wkkv" Mar 07 21:42:05.825643 master-0 kubenswrapper[16352]: I0307 21:42:05.825079 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-4wkkv" event={"ID":"b0c040c8-ff93-4fd1-8f24-41bf2e0a8985","Type":"ContainerDied","Data":"4038a04d366ac1e66ab753c6404d18fab589dd8d8f50eb29e1337922fa518982"} Mar 07 21:42:05.825643 master-0 kubenswrapper[16352]: I0307 21:42:05.825155 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4038a04d366ac1e66ab753c6404d18fab589dd8d8f50eb29e1337922fa518982" Mar 07 21:42:05.970256 master-0 kubenswrapper[16352]: I0307 21:42:05.970008 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-6cc7544794-vmcq4"] Mar 07 21:42:05.971506 master-0 kubenswrapper[16352]: E0307 21:42:05.971468 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="init" Mar 07 21:42:05.971506 master-0 kubenswrapper[16352]: I0307 21:42:05.971502 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="init" Mar 07 21:42:05.971613 master-0 kubenswrapper[16352]: E0307 21:42:05.971575 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" containerName="placement-db-sync" Mar 07 21:42:05.971613 master-0 kubenswrapper[16352]: I0307 21:42:05.971587 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" containerName="placement-db-sync" Mar 07 21:42:05.971720 master-0 kubenswrapper[16352]: E0307 21:42:05.971671 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="dnsmasq-dns" Mar 07 21:42:05.971720 master-0 kubenswrapper[16352]: I0307 21:42:05.971708 16352 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="dnsmasq-dns" Mar 07 21:42:05.974887 master-0 kubenswrapper[16352]: I0307 21:42:05.974841 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="dnsmasq-dns" Mar 07 21:42:05.974980 master-0 kubenswrapper[16352]: I0307 21:42:05.974893 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" containerName="placement-db-sync" Mar 07 21:42:05.976939 master-0 kubenswrapper[16352]: I0307 21:42:05.976899 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:05.981115 master-0 kubenswrapper[16352]: I0307 21:42:05.981081 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 07 21:42:05.981594 master-0 kubenswrapper[16352]: I0307 21:42:05.981375 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 07 21:42:05.981657 master-0 kubenswrapper[16352]: I0307 21:42:05.981429 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 07 21:42:05.982213 master-0 kubenswrapper[16352]: I0307 21:42:05.981452 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 07 21:42:06.018786 master-0 kubenswrapper[16352]: I0307 21:42:06.018656 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cc7544794-vmcq4"] Mar 07 21:42:06.037078 master-0 kubenswrapper[16352]: I0307 21:42:06.037028 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-config-data\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " 
pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.037743 master-0 kubenswrapper[16352]: I0307 21:42:06.037722 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-logs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.037849 master-0 kubenswrapper[16352]: I0307 21:42:06.037833 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xrvv\" (UniqueName: \"kubernetes.io/projected/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-kube-api-access-7xrvv\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.038806 master-0 kubenswrapper[16352]: I0307 21:42:06.038788 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-internal-tls-certs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.039167 master-0 kubenswrapper[16352]: I0307 21:42:06.039105 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-public-tls-certs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.039367 master-0 kubenswrapper[16352]: I0307 21:42:06.039350 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-combined-ca-bundle\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.039539 master-0 kubenswrapper[16352]: I0307 21:42:06.039525 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-scripts\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.142248 master-0 kubenswrapper[16352]: I0307 21:42:06.141684 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-combined-ca-bundle\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.142248 master-0 kubenswrapper[16352]: I0307 21:42:06.141777 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-scripts\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.142248 master-0 kubenswrapper[16352]: I0307 21:42:06.141823 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-config-data\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.142248 master-0 kubenswrapper[16352]: I0307 21:42:06.141866 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-logs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.142248 master-0 kubenswrapper[16352]: I0307 21:42:06.141884 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xrvv\" (UniqueName: \"kubernetes.io/projected/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-kube-api-access-7xrvv\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.143231 master-0 kubenswrapper[16352]: I0307 21:42:06.142266 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-logs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.143231 master-0 kubenswrapper[16352]: I0307 21:42:06.142308 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-internal-tls-certs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.143231 master-0 kubenswrapper[16352]: I0307 21:42:06.142333 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-public-tls-certs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.146292 master-0 kubenswrapper[16352]: I0307 21:42:06.146046 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-config-data\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.146292 master-0 kubenswrapper[16352]: I0307 21:42:06.146244 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-combined-ca-bundle\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.148270 master-0 kubenswrapper[16352]: I0307 21:42:06.147306 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-public-tls-certs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.148270 master-0 kubenswrapper[16352]: I0307 21:42:06.148232 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-internal-tls-certs\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.150219 master-0 kubenswrapper[16352]: I0307 21:42:06.150081 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-scripts\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.159931 master-0 kubenswrapper[16352]: I0307 21:42:06.159873 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xrvv\" (UniqueName: 
\"kubernetes.io/projected/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-kube-api-access-7xrvv\") pod \"placement-6cc7544794-vmcq4\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.319425 master-0 kubenswrapper[16352]: I0307 21:42:06.319315 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:06.914730 master-0 kubenswrapper[16352]: I0307 21:42:06.912529 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-6cc7544794-vmcq4"] Mar 07 21:42:07.226807 master-0 kubenswrapper[16352]: I0307 21:42:07.223743 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-6465c5fc85-2kk4v" podUID="aae009f4-e625-45c1-a7b5-f8f5ca02bb1e" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.195:5353: i/o timeout" Mar 07 21:42:12.262927 master-0 kubenswrapper[16352]: I0307 21:42:12.250099 16352 patch_prober.go:28] interesting pod/catalog-operator-7d9c49f57b-j454x container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.128.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 07 21:42:12.262927 master-0 kubenswrapper[16352]: I0307 21:42:12.250231 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-7d9c49f57b-j454x" podUID="7b89e6e3-1fe4-4ada-a5ca-0d7b2ae16149" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.128.0.25:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 07 21:42:13.342596 master-0 kubenswrapper[16352]: I0307 21:42:13.342486 16352 generic.go:334] "Generic (PLEG): container finished" podID="2a6736c6-a65f-4821-91a1-747418c62459" 
containerID="180a5b72cc6e6eaf2380e30c0c49902fc2604b0b10fbf55bb493c9857cb4c5c9" exitCode=0 Mar 07 21:42:13.343442 master-0 kubenswrapper[16352]: I0307 21:42:13.342620 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-mtvqh" event={"ID":"2a6736c6-a65f-4821-91a1-747418c62459","Type":"ContainerDied","Data":"180a5b72cc6e6eaf2380e30c0c49902fc2604b0b10fbf55bb493c9857cb4c5c9"} Mar 07 21:42:13.347817 master-0 kubenswrapper[16352]: I0307 21:42:13.347766 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc7544794-vmcq4" event={"ID":"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f","Type":"ContainerStarted","Data":"a039f4a32907524546ecdece0f12ff81c8e16de6f2c7d7dbc0e3d8831b933b8c"} Mar 07 21:42:13.347894 master-0 kubenswrapper[16352]: I0307 21:42:13.347820 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc7544794-vmcq4" event={"ID":"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f","Type":"ContainerStarted","Data":"56401df6f757863c0a67a6d2486cdef43b3b8a5aea7f0f3263e07faea84a7d68"} Mar 07 21:42:13.347894 master-0 kubenswrapper[16352]: I0307 21:42:13.347834 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc7544794-vmcq4" event={"ID":"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f","Type":"ContainerStarted","Data":"a1cc218270e0db485436dae256ca4c7372913f7f349d674148a4b32c43907635"} Mar 07 21:42:13.348455 master-0 kubenswrapper[16352]: I0307 21:42:13.348409 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:13.348455 master-0 kubenswrapper[16352]: I0307 21:42:13.348445 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:13.351151 master-0 kubenswrapper[16352]: I0307 21:42:13.350674 16352 generic.go:334] "Generic (PLEG): container finished" podID="2bcad141-6b4f-4b5c-a3e0-236d54fe850c" 
containerID="2268a2fafaeba584ed472288cc56505476177236c2d68526b6349a3f1741d5b3" exitCode=0 Mar 07 21:42:13.351151 master-0 kubenswrapper[16352]: I0307 21:42:13.350740 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zh2n5" event={"ID":"2bcad141-6b4f-4b5c-a3e0-236d54fe850c","Type":"ContainerDied","Data":"2268a2fafaeba584ed472288cc56505476177236c2d68526b6349a3f1741d5b3"} Mar 07 21:42:13.437436 master-0 kubenswrapper[16352]: I0307 21:42:13.437028 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-6cc7544794-vmcq4" podStartSLOduration=8.437005062 podStartE2EDuration="8.437005062s" podCreationTimestamp="2026-03-07 21:42:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:13.423815076 +0000 UTC m=+1456.494520155" watchObservedRunningTime="2026-03-07 21:42:13.437005062 +0000 UTC m=+1456.507710121" Mar 07 21:42:14.370844 master-0 kubenswrapper[16352]: I0307 21:42:14.370720 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-mtvqh" event={"ID":"2a6736c6-a65f-4821-91a1-747418c62459","Type":"ContainerStarted","Data":"6cdac522b3b5a0c31b3bde57331a76e735d384d73d686550f0182c05c512a505"} Mar 07 21:42:14.414355 master-0 kubenswrapper[16352]: I0307 21:42:14.413950 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-db-sync-mtvqh" podStartSLOduration=13.168972278 podStartE2EDuration="21.41391827s" podCreationTimestamp="2026-03-07 21:41:53 +0000 UTC" firstStartedPulling="2026-03-07 21:42:04.29470389 +0000 UTC m=+1447.365408949" lastFinishedPulling="2026-03-07 21:42:12.539649882 +0000 UTC m=+1455.610354941" observedRunningTime="2026-03-07 21:42:14.400596821 +0000 UTC m=+1457.471301890" watchObservedRunningTime="2026-03-07 21:42:14.41391827 +0000 UTC m=+1457.484623329" Mar 07 21:42:14.852280 master-0 kubenswrapper[16352]: 
I0307 21:42:14.852222 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zh2n5" Mar 07 21:42:14.853920 master-0 kubenswrapper[16352]: I0307 21:42:14.853861 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qwlq\" (UniqueName: \"kubernetes.io/projected/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-kube-api-access-6qwlq\") pod \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " Mar 07 21:42:14.853999 master-0 kubenswrapper[16352]: I0307 21:42:14.853946 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-fernet-keys\") pod \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " Mar 07 21:42:14.854044 master-0 kubenswrapper[16352]: I0307 21:42:14.854006 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-scripts\") pod \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " Mar 07 21:42:14.854729 master-0 kubenswrapper[16352]: I0307 21:42:14.854075 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-config-data\") pod \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " Mar 07 21:42:14.854729 master-0 kubenswrapper[16352]: I0307 21:42:14.854113 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle\") pod \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " Mar 07 21:42:14.854729 
master-0 kubenswrapper[16352]: I0307 21:42:14.854277 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-credential-keys\") pod \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " Mar 07 21:42:14.857564 master-0 kubenswrapper[16352]: I0307 21:42:14.857487 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2bcad141-6b4f-4b5c-a3e0-236d54fe850c" (UID: "2bcad141-6b4f-4b5c-a3e0-236d54fe850c"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:14.857564 master-0 kubenswrapper[16352]: I0307 21:42:14.857515 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-kube-api-access-6qwlq" (OuterVolumeSpecName: "kube-api-access-6qwlq") pod "2bcad141-6b4f-4b5c-a3e0-236d54fe850c" (UID: "2bcad141-6b4f-4b5c-a3e0-236d54fe850c"). InnerVolumeSpecName "kube-api-access-6qwlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:14.858653 master-0 kubenswrapper[16352]: I0307 21:42:14.858484 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2bcad141-6b4f-4b5c-a3e0-236d54fe850c" (UID: "2bcad141-6b4f-4b5c-a3e0-236d54fe850c"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:14.859762 master-0 kubenswrapper[16352]: I0307 21:42:14.859725 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-scripts" (OuterVolumeSpecName: "scripts") pod "2bcad141-6b4f-4b5c-a3e0-236d54fe850c" (UID: "2bcad141-6b4f-4b5c-a3e0-236d54fe850c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:14.893483 master-0 kubenswrapper[16352]: E0307 21:42:14.893408 16352 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle podName:2bcad141-6b4f-4b5c-a3e0-236d54fe850c nodeName:}" failed. No retries permitted until 2026-03-07 21:42:15.39337001 +0000 UTC m=+1458.464075069 (durationBeforeRetry 500ms). Error: error cleaning subPath mounts for volume "combined-ca-bundle" (UniqueName: "kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle") pod "2bcad141-6b4f-4b5c-a3e0-236d54fe850c" (UID: "2bcad141-6b4f-4b5c-a3e0-236d54fe850c") : error deleting /var/lib/kubelet/pods/2bcad141-6b4f-4b5c-a3e0-236d54fe850c/volume-subpaths: remove /var/lib/kubelet/pods/2bcad141-6b4f-4b5c-a3e0-236d54fe850c/volume-subpaths: no such file or directory Mar 07 21:42:14.898991 master-0 kubenswrapper[16352]: I0307 21:42:14.898929 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-config-data" (OuterVolumeSpecName: "config-data") pod "2bcad141-6b4f-4b5c-a3e0-236d54fe850c" (UID: "2bcad141-6b4f-4b5c-a3e0-236d54fe850c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:14.957089 master-0 kubenswrapper[16352]: I0307 21:42:14.956183 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:14.957089 master-0 kubenswrapper[16352]: I0307 21:42:14.956230 16352 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-credential-keys\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:14.957089 master-0 kubenswrapper[16352]: I0307 21:42:14.956244 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6qwlq\" (UniqueName: \"kubernetes.io/projected/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-kube-api-access-6qwlq\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:14.957089 master-0 kubenswrapper[16352]: I0307 21:42:14.956258 16352 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:14.957089 master-0 kubenswrapper[16352]: I0307 21:42:14.956267 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:15.383898 master-0 kubenswrapper[16352]: I0307 21:42:15.383808 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-zh2n5" event={"ID":"2bcad141-6b4f-4b5c-a3e0-236d54fe850c","Type":"ContainerDied","Data":"c9ec7f655936c2bc62307b0b463364a71534bb981e11e4c386c166c5c91be910"} Mar 07 21:42:15.383898 master-0 kubenswrapper[16352]: I0307 21:42:15.383887 16352 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="c9ec7f655936c2bc62307b0b463364a71534bb981e11e4c386c166c5c91be910" Mar 07 21:42:15.383898 master-0 kubenswrapper[16352]: I0307 21:42:15.383885 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-zh2n5" Mar 07 21:42:15.473900 master-0 kubenswrapper[16352]: I0307 21:42:15.473818 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle\") pod \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\" (UID: \"2bcad141-6b4f-4b5c-a3e0-236d54fe850c\") " Mar 07 21:42:15.478473 master-0 kubenswrapper[16352]: I0307 21:42:15.478409 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2bcad141-6b4f-4b5c-a3e0-236d54fe850c" (UID: "2bcad141-6b4f-4b5c-a3e0-236d54fe850c"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:15.577991 master-0 kubenswrapper[16352]: I0307 21:42:15.577921 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2bcad141-6b4f-4b5c-a3e0-236d54fe850c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:15.626526 master-0 kubenswrapper[16352]: I0307 21:42:15.626440 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-798d5f97fb-2sbnv"] Mar 07 21:42:15.633204 master-0 kubenswrapper[16352]: E0307 21:42:15.633119 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bcad141-6b4f-4b5c-a3e0-236d54fe850c" containerName="keystone-bootstrap" Mar 07 21:42:15.633204 master-0 kubenswrapper[16352]: I0307 21:42:15.633189 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bcad141-6b4f-4b5c-a3e0-236d54fe850c" containerName="keystone-bootstrap" Mar 07 21:42:15.635389 master-0 kubenswrapper[16352]: I0307 21:42:15.633908 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bcad141-6b4f-4b5c-a3e0-236d54fe850c" containerName="keystone-bootstrap" Mar 07 21:42:15.650190 master-0 kubenswrapper[16352]: I0307 21:42:15.650104 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.653507 master-0 kubenswrapper[16352]: I0307 21:42:15.653247 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 07 21:42:15.655257 master-0 kubenswrapper[16352]: I0307 21:42:15.653682 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 07 21:42:15.666087 master-0 kubenswrapper[16352]: I0307 21:42:15.665777 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-798d5f97fb-2sbnv"] Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681083 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-internal-tls-certs\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681211 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-config-data\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681264 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8qz9j\" (UniqueName: \"kubernetes.io/projected/f2667b0f-a84d-4dea-9af1-8782f54f0464-kube-api-access-8qz9j\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681480 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-public-tls-certs\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681538 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-scripts\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681676 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-fernet-keys\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681716 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-combined-ca-bundle\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.682937 master-0 kubenswrapper[16352]: I0307 21:42:15.681740 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-credential-keys\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783025 master-0 
kubenswrapper[16352]: I0307 21:42:15.782935 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-scripts\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783297 master-0 kubenswrapper[16352]: I0307 21:42:15.783104 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-fernet-keys\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783297 master-0 kubenswrapper[16352]: I0307 21:42:15.783134 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-combined-ca-bundle\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783297 master-0 kubenswrapper[16352]: I0307 21:42:15.783157 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-credential-keys\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783297 master-0 kubenswrapper[16352]: I0307 21:42:15.783179 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-internal-tls-certs\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783297 master-0 kubenswrapper[16352]: I0307 
21:42:15.783212 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-config-data\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783297 master-0 kubenswrapper[16352]: I0307 21:42:15.783252 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8qz9j\" (UniqueName: \"kubernetes.io/projected/f2667b0f-a84d-4dea-9af1-8782f54f0464-kube-api-access-8qz9j\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.783728 master-0 kubenswrapper[16352]: I0307 21:42:15.783349 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-public-tls-certs\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.787255 master-0 kubenswrapper[16352]: I0307 21:42:15.787207 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-public-tls-certs\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.788092 master-0 kubenswrapper[16352]: I0307 21:42:15.788035 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-config-data\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.788645 master-0 kubenswrapper[16352]: I0307 21:42:15.788608 16352 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-credential-keys\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.789181 master-0 kubenswrapper[16352]: I0307 21:42:15.789149 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-combined-ca-bundle\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.790189 master-0 kubenswrapper[16352]: I0307 21:42:15.789780 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-scripts\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.790189 master-0 kubenswrapper[16352]: I0307 21:42:15.790033 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-internal-tls-certs\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.790909 master-0 kubenswrapper[16352]: I0307 21:42:15.790875 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f2667b0f-a84d-4dea-9af1-8782f54f0464-fernet-keys\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.803950 master-0 kubenswrapper[16352]: I0307 21:42:15.803901 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-8qz9j\" (UniqueName: \"kubernetes.io/projected/f2667b0f-a84d-4dea-9af1-8782f54f0464-kube-api-access-8qz9j\") pod \"keystone-798d5f97fb-2sbnv\" (UID: \"f2667b0f-a84d-4dea-9af1-8782f54f0464\") " pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:15.992533 master-0 kubenswrapper[16352]: I0307 21:42:15.992350 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:16.471748 master-0 kubenswrapper[16352]: I0307 21:42:16.469443 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-798d5f97fb-2sbnv"] Mar 07 21:42:17.409303 master-0 kubenswrapper[16352]: I0307 21:42:17.409227 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-798d5f97fb-2sbnv" event={"ID":"f2667b0f-a84d-4dea-9af1-8782f54f0464","Type":"ContainerStarted","Data":"5b1ba5abd1fb71a28299c83bb1d8a452fb1738ffd67e1fcb8b14d30677d43c53"} Mar 07 21:42:17.409303 master-0 kubenswrapper[16352]: I0307 21:42:17.409295 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-798d5f97fb-2sbnv" event={"ID":"f2667b0f-a84d-4dea-9af1-8782f54f0464","Type":"ContainerStarted","Data":"7896fb3422efb3c1fb7832a085327fb367f1c3e0f85062d6dfd847d76c04ffb0"} Mar 07 21:42:17.409607 master-0 kubenswrapper[16352]: I0307 21:42:17.409457 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:17.441925 master-0 kubenswrapper[16352]: I0307 21:42:17.441822 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-798d5f97fb-2sbnv" podStartSLOduration=2.441797012 podStartE2EDuration="2.441797012s" podCreationTimestamp="2026-03-07 21:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:17.440863649 +0000 UTC m=+1460.511568708" watchObservedRunningTime="2026-03-07 
21:42:17.441797012 +0000 UTC m=+1460.512502071" Mar 07 21:42:18.423571 master-0 kubenswrapper[16352]: I0307 21:42:18.423476 16352 generic.go:334] "Generic (PLEG): container finished" podID="c17f59fb-df31-45d5-9077-ac10aa310af2" containerID="12f5f40a8fa04773bbd3cbec144234660f8987f2ad2157088c2d0af0fb8c14f7" exitCode=0 Mar 07 21:42:18.425018 master-0 kubenswrapper[16352]: I0307 21:42:18.423603 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-db-sync-m7xht" event={"ID":"c17f59fb-df31-45d5-9077-ac10aa310af2","Type":"ContainerDied","Data":"12f5f40a8fa04773bbd3cbec144234660f8987f2ad2157088c2d0af0fb8c14f7"} Mar 07 21:42:19.915000 master-0 kubenswrapper[16352]: I0307 21:42:19.914908 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:42:19.955262 master-0 kubenswrapper[16352]: I0307 21:42:19.953374 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v25l\" (UniqueName: \"kubernetes.io/projected/c17f59fb-df31-45d5-9077-ac10aa310af2-kube-api-access-9v25l\") pod \"c17f59fb-df31-45d5-9077-ac10aa310af2\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " Mar 07 21:42:19.955262 master-0 kubenswrapper[16352]: I0307 21:42:19.953494 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-db-sync-config-data\") pod \"c17f59fb-df31-45d5-9077-ac10aa310af2\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " Mar 07 21:42:19.955262 master-0 kubenswrapper[16352]: I0307 21:42:19.953659 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-config-data\") pod \"c17f59fb-df31-45d5-9077-ac10aa310af2\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " Mar 07 21:42:19.955262 master-0 
kubenswrapper[16352]: I0307 21:42:19.953733 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c17f59fb-df31-45d5-9077-ac10aa310af2-etc-machine-id\") pod \"c17f59fb-df31-45d5-9077-ac10aa310af2\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " Mar 07 21:42:19.955262 master-0 kubenswrapper[16352]: I0307 21:42:19.953759 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-scripts\") pod \"c17f59fb-df31-45d5-9077-ac10aa310af2\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " Mar 07 21:42:19.955262 master-0 kubenswrapper[16352]: I0307 21:42:19.953879 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-combined-ca-bundle\") pod \"c17f59fb-df31-45d5-9077-ac10aa310af2\" (UID: \"c17f59fb-df31-45d5-9077-ac10aa310af2\") " Mar 07 21:42:19.955262 master-0 kubenswrapper[16352]: I0307 21:42:19.954128 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c17f59fb-df31-45d5-9077-ac10aa310af2-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c17f59fb-df31-45d5-9077-ac10aa310af2" (UID: "c17f59fb-df31-45d5-9077-ac10aa310af2"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:19.955262 master-0 kubenswrapper[16352]: I0307 21:42:19.954541 16352 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c17f59fb-df31-45d5-9077-ac10aa310af2-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:19.975205 master-0 kubenswrapper[16352]: I0307 21:42:19.975101 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "c17f59fb-df31-45d5-9077-ac10aa310af2" (UID: "c17f59fb-df31-45d5-9077-ac10aa310af2"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:19.975860 master-0 kubenswrapper[16352]: I0307 21:42:19.975788 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-scripts" (OuterVolumeSpecName: "scripts") pod "c17f59fb-df31-45d5-9077-ac10aa310af2" (UID: "c17f59fb-df31-45d5-9077-ac10aa310af2"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:19.977779 master-0 kubenswrapper[16352]: I0307 21:42:19.977651 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c17f59fb-df31-45d5-9077-ac10aa310af2-kube-api-access-9v25l" (OuterVolumeSpecName: "kube-api-access-9v25l") pod "c17f59fb-df31-45d5-9077-ac10aa310af2" (UID: "c17f59fb-df31-45d5-9077-ac10aa310af2"). InnerVolumeSpecName "kube-api-access-9v25l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:19.988568 master-0 kubenswrapper[16352]: I0307 21:42:19.988489 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c17f59fb-df31-45d5-9077-ac10aa310af2" (UID: "c17f59fb-df31-45d5-9077-ac10aa310af2"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:20.010538 master-0 kubenswrapper[16352]: I0307 21:42:20.010437 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-config-data" (OuterVolumeSpecName: "config-data") pod "c17f59fb-df31-45d5-9077-ac10aa310af2" (UID: "c17f59fb-df31-45d5-9077-ac10aa310af2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:20.058670 master-0 kubenswrapper[16352]: I0307 21:42:20.058495 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:20.059300 master-0 kubenswrapper[16352]: I0307 21:42:20.058757 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:20.059300 master-0 kubenswrapper[16352]: I0307 21:42:20.058783 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:20.059300 master-0 kubenswrapper[16352]: I0307 21:42:20.058806 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v25l\" (UniqueName: 
\"kubernetes.io/projected/c17f59fb-df31-45d5-9077-ac10aa310af2-kube-api-access-9v25l\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:20.059300 master-0 kubenswrapper[16352]: I0307 21:42:20.058825 16352 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/c17f59fb-df31-45d5-9077-ac10aa310af2-db-sync-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:20.454110 master-0 kubenswrapper[16352]: I0307 21:42:20.454038 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-db-sync-m7xht" event={"ID":"c17f59fb-df31-45d5-9077-ac10aa310af2","Type":"ContainerDied","Data":"9768b16e24c5ec52ce4a5a132c04c14cef91bc209e2b69a0afa633a6de221b85"} Mar 07 21:42:20.454110 master-0 kubenswrapper[16352]: I0307 21:42:20.454096 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9768b16e24c5ec52ce4a5a132c04c14cef91bc209e2b69a0afa633a6de221b85" Mar 07 21:42:20.454426 master-0 kubenswrapper[16352]: I0307 21:42:20.454153 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-db-sync-m7xht" Mar 07 21:42:20.839068 master-0 kubenswrapper[16352]: I0307 21:42:20.838137 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-scheduler-0"] Mar 07 21:42:20.839068 master-0 kubenswrapper[16352]: E0307 21:42:20.838728 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c17f59fb-df31-45d5-9077-ac10aa310af2" containerName="cinder-86971-db-sync" Mar 07 21:42:20.839068 master-0 kubenswrapper[16352]: I0307 21:42:20.838745 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c17f59fb-df31-45d5-9077-ac10aa310af2" containerName="cinder-86971-db-sync" Mar 07 21:42:20.839068 master-0 kubenswrapper[16352]: I0307 21:42:20.839007 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c17f59fb-df31-45d5-9077-ac10aa310af2" containerName="cinder-86971-db-sync" Mar 07 21:42:20.840426 master-0 kubenswrapper[16352]: I0307 21:42:20.840381 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.848725 master-0 kubenswrapper[16352]: I0307 21:42:20.847363 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-scripts" Mar 07 21:42:20.848725 master-0 kubenswrapper[16352]: I0307 21:42:20.847760 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-config-data" Mar 07 21:42:20.848725 master-0 kubenswrapper[16352]: I0307 21:42:20.847942 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-scheduler-config-data" Mar 07 21:42:20.852543 master-0 kubenswrapper[16352]: I0307 21:42:20.852477 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-scheduler-0"] Mar 07 21:42:20.890710 master-0 kubenswrapper[16352]: I0307 21:42:20.890537 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-combined-ca-bundle\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.890710 master-0 kubenswrapper[16352]: I0307 21:42:20.890624 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data-custom\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.891025 master-0 kubenswrapper[16352]: I0307 21:42:20.890782 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " 
pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.891025 master-0 kubenswrapper[16352]: I0307 21:42:20.890896 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-scripts\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.891025 master-0 kubenswrapper[16352]: I0307 21:42:20.890934 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/156d22b1-6d30-42b4-8cff-e5a563d2861d-etc-machine-id\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.891025 master-0 kubenswrapper[16352]: I0307 21:42:20.891004 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftr6q\" (UniqueName: \"kubernetes.io/projected/156d22b1-6d30-42b4-8cff-e5a563d2861d-kube-api-access-ftr6q\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.994935 master-0 kubenswrapper[16352]: I0307 21:42:20.994868 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.995824 master-0 kubenswrapper[16352]: I0307 21:42:20.995801 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-scripts\") pod \"cinder-86971-scheduler-0\" (UID: 
\"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.995946 master-0 kubenswrapper[16352]: I0307 21:42:20.995929 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/156d22b1-6d30-42b4-8cff-e5a563d2861d-etc-machine-id\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.996172 master-0 kubenswrapper[16352]: I0307 21:42:20.996057 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftr6q\" (UniqueName: \"kubernetes.io/projected/156d22b1-6d30-42b4-8cff-e5a563d2861d-kube-api-access-ftr6q\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.996315 master-0 kubenswrapper[16352]: I0307 21:42:20.996299 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-combined-ca-bundle\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.996411 master-0 kubenswrapper[16352]: I0307 21:42:20.996395 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data-custom\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:20.998891 master-0 kubenswrapper[16352]: I0307 21:42:20.998812 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-589dd8c5c-bm6b7"] Mar 07 21:42:20.999645 master-0 kubenswrapper[16352]: I0307 21:42:20.999476 16352 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/156d22b1-6d30-42b4-8cff-e5a563d2861d-etc-machine-id\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:21.002146 master-0 kubenswrapper[16352]: I0307 21:42:21.001444 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-combined-ca-bundle\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:21.002146 master-0 kubenswrapper[16352]: I0307 21:42:21.001495 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.002146 master-0 kubenswrapper[16352]: I0307 21:42:21.002087 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data-custom\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:21.002514 master-0 kubenswrapper[16352]: I0307 21:42:21.002473 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-scripts\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:21.013107 master-0 kubenswrapper[16352]: I0307 21:42:21.012847 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 
21:42:21.035266 master-0 kubenswrapper[16352]: I0307 21:42:21.033769 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftr6q\" (UniqueName: \"kubernetes.io/projected/156d22b1-6d30-42b4-8cff-e5a563d2861d-kube-api-access-ftr6q\") pod \"cinder-86971-scheduler-0\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:21.102953 master-0 kubenswrapper[16352]: I0307 21:42:21.102894 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjnj9\" (UniqueName: \"kubernetes.io/projected/a962b1a6-daaa-4ffe-9395-419843cd2a6f-kube-api-access-rjnj9\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.103372 master-0 kubenswrapper[16352]: I0307 21:42:21.103350 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-config\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.103496 master-0 kubenswrapper[16352]: I0307 21:42:21.103480 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.103621 master-0 kubenswrapper[16352]: I0307 21:42:21.103580 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: 
\"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.103775 master-0 kubenswrapper[16352]: I0307 21:42:21.103760 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-svc\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.103959 master-0 kubenswrapper[16352]: I0307 21:42:21.103944 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-swift-storage-0\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.105140 master-0 kubenswrapper[16352]: I0307 21:42:21.105101 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589dd8c5c-bm6b7"] Mar 07 21:42:21.148736 master-0 kubenswrapper[16352]: I0307 21:42:21.148530 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:21.154503 master-0 kubenswrapper[16352]: I0307 21:42:21.154123 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.169003 master-0 kubenswrapper[16352]: I0307 21:42:21.168523 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-volume-lvm-iscsi-config-data" Mar 07 21:42:21.173973 master-0 kubenswrapper[16352]: I0307 21:42:21.173921 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:21.206283 master-0 kubenswrapper[16352]: I0307 21:42:21.206221 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-config\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.206637 master-0 kubenswrapper[16352]: I0307 21:42:21.206615 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.206810 master-0 kubenswrapper[16352]: I0307 21:42:21.206793 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.206967 master-0 kubenswrapper[16352]: I0307 21:42:21.206948 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-svc\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " 
pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.207108 master-0 kubenswrapper[16352]: I0307 21:42:21.207087 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-swift-storage-0\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.207248 master-0 kubenswrapper[16352]: I0307 21:42:21.207230 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rjnj9\" (UniqueName: \"kubernetes.io/projected/a962b1a6-daaa-4ffe-9395-419843cd2a6f-kube-api-access-rjnj9\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.207587 master-0 kubenswrapper[16352]: I0307 21:42:21.207541 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-config\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.207650 master-0 kubenswrapper[16352]: I0307 21:42:21.207589 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-sb\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.207650 master-0 kubenswrapper[16352]: I0307 21:42:21.207626 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-nb\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " 
pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.208614 master-0 kubenswrapper[16352]: I0307 21:42:21.208574 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-swift-storage-0\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.210817 master-0 kubenswrapper[16352]: I0307 21:42:21.210780 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-svc\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.234544 master-0 kubenswrapper[16352]: I0307 21:42:21.234472 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-backup-0"] Mar 07 21:42:21.240594 master-0 kubenswrapper[16352]: I0307 21:42:21.240553 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rjnj9\" (UniqueName: \"kubernetes.io/projected/a962b1a6-daaa-4ffe-9395-419843cd2a6f-kube-api-access-rjnj9\") pod \"dnsmasq-dns-589dd8c5c-bm6b7\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") " pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.241711 master-0 kubenswrapper[16352]: I0307 21:42:21.241648 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-backup-0"] Mar 07 21:42:21.241882 master-0 kubenswrapper[16352]: I0307 21:42:21.241841 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.244192 master-0 kubenswrapper[16352]: I0307 21:42:21.244167 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-backup-config-data" Mar 07 21:42:21.300944 master-0 kubenswrapper[16352]: I0307 21:42:21.300875 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:21.309472 master-0 kubenswrapper[16352]: I0307 21:42:21.309406 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-nvme\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.309807 master-0 kubenswrapper[16352]: I0307 21:42:21.309781 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-combined-ca-bundle\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310040 master-0 kubenswrapper[16352]: I0307 21:42:21.310017 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310171 master-0 kubenswrapper[16352]: I0307 21:42:21.310157 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2h5g2\" (UniqueName: 
\"kubernetes.io/projected/c3666942-0515-42ad-aa2a-20be90d7bc83-kube-api-access-2h5g2\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310257 master-0 kubenswrapper[16352]: I0307 21:42:21.310244 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-iscsi\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310415 master-0 kubenswrapper[16352]: I0307 21:42:21.310395 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310551 master-0 kubenswrapper[16352]: I0307 21:42:21.310530 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-sys\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310716 master-0 kubenswrapper[16352]: I0307 21:42:21.310674 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-dev\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310833 master-0 kubenswrapper[16352]: I0307 21:42:21.310816 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-lib-modules\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.310952 master-0 kubenswrapper[16352]: I0307 21:42:21.310931 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-machine-id\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.311068 master-0 kubenswrapper[16352]: I0307 21:42:21.311048 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data-custom\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.311215 master-0 kubenswrapper[16352]: I0307 21:42:21.311195 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-lib-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.311359 master-0 kubenswrapper[16352]: I0307 21:42:21.311342 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-scripts\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " 
pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.311495 master-0 kubenswrapper[16352]: I0307 21:42:21.311481 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-run\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.311596 master-0 kubenswrapper[16352]: I0307 21:42:21.311581 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-brick\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.419206 master-0 kubenswrapper[16352]: I0307 21:42:21.419041 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424340 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-nvme\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424465 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-combined-ca-bundle\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424537 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-nvme\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424641 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6j8h\" (UniqueName: \"kubernetes.io/projected/801bdc03-e76c-4dfb-a97e-1327be4a522a-kube-api-access-k6j8h\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424722 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-combined-ca-bundle\") pod 
\"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424763 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424817 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-run\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424858 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-dev\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424914 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2h5g2\" (UniqueName: \"kubernetes.io/projected/c3666942-0515-42ad-aa2a-20be90d7bc83-kube-api-access-2h5g2\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.424951 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-iscsi\") pod 
\"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425036 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-iscsi\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425073 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-cinder\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425103 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-sys\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425133 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-scripts\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425165 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data\") pod 
\"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425214 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-sys\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425263 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-lib-modules\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425317 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-brick\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425354 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-dev\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425392 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-lib-modules\") pod 
\"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425453 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-machine-id\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425485 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425514 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data-custom\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425539 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-machine-id\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.425579 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: 
\"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-lib-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.427494 master-0 kubenswrapper[16352]: I0307 21:42:21.426587 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-nvme\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.428502 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.429227 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-machine-id\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.429296 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-dev\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.429337 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-lib-modules\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.430104 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-lib-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.430180 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-lib-cinder\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.430247 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-scripts\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.430340 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data-custom\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.430409 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-run\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.430480 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-brick\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.431484 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-combined-ca-bundle\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.431541 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-brick\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.431617 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-iscsi\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.431671 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-run\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.432159 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-sys\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.437969 master-0 kubenswrapper[16352]: I0307 21:42:21.436862 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data-custom\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.439963 master-0 kubenswrapper[16352]: I0307 21:42:21.438144 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.454738 master-0 kubenswrapper[16352]: I0307 21:42:21.448631 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-scripts\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.454738 master-0 kubenswrapper[16352]: I0307 21:42:21.448765 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:21.454738 master-0 
kubenswrapper[16352]: I0307 21:42:21.451577 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.461724 master-0 kubenswrapper[16352]: I0307 21:42:21.460002 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2h5g2\" (UniqueName: \"kubernetes.io/projected/c3666942-0515-42ad-aa2a-20be90d7bc83-kube-api-access-2h5g2\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.481745 master-0 kubenswrapper[16352]: I0307 21:42:21.479015 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-api-config-data" Mar 07 21:42:21.487553 master-0 kubenswrapper[16352]: I0307 21:42:21.486120 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:21.488359 master-0 kubenswrapper[16352]: I0307 21:42:21.488301 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:21.538861 master-0 kubenswrapper[16352]: I0307 21:42:21.538580 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6j8h\" (UniqueName: \"kubernetes.io/projected/801bdc03-e76c-4dfb-a97e-1327be4a522a-kube-api-access-k6j8h\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.538861 master-0 kubenswrapper[16352]: I0307 21:42:21.538658 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-combined-ca-bundle\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.538861 master-0 kubenswrapper[16352]: I0307 21:42:21.538726 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-run\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.538861 master-0 kubenswrapper[16352]: I0307 21:42:21.538760 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-dev\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.539854 master-0 kubenswrapper[16352]: I0307 21:42:21.539791 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-iscsi\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 
21:42:21.539958 master-0 kubenswrapper[16352]: I0307 21:42:21.539860 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-cinder\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.540116 master-0 kubenswrapper[16352]: I0307 21:42:21.539892 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-sys\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.540388 master-0 kubenswrapper[16352]: I0307 21:42:21.540331 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-scripts\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.541252 master-0 kubenswrapper[16352]: I0307 21:42:21.540596 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-lib-modules\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.541252 master-0 kubenswrapper[16352]: I0307 21:42:21.540656 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-brick\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.541252 master-0 kubenswrapper[16352]: I0307 21:42:21.540745 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.541252 master-0 kubenswrapper[16352]: I0307 21:42:21.540773 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-machine-id\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.541252 master-0 kubenswrapper[16352]: I0307 21:42:21.540840 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-lib-cinder\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.541252 master-0 kubenswrapper[16352]: I0307 21:42:21.540919 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data-custom\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.541252 master-0 kubenswrapper[16352]: I0307 21:42:21.541138 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-nvme\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.544525 master-0 kubenswrapper[16352]: I0307 21:42:21.543767 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-combined-ca-bundle\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.548179 master-0 kubenswrapper[16352]: I0307 21:42:21.548027 16352 generic.go:334] "Generic (PLEG): container finished" podID="218021bb-e4db-42b1-a553-f2a373cd9565" containerID="8e7336a0eb6ab818f0c8d258f43028b48d1c7900ff93861475b25e2a1a77ecd8" exitCode=0 Mar 07 21:42:21.548179 master-0 kubenswrapper[16352]: I0307 21:42:21.548111 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97jz8" event={"ID":"218021bb-e4db-42b1-a553-f2a373cd9565","Type":"ContainerDied","Data":"8e7336a0eb6ab818f0c8d258f43028b48d1c7900ff93861475b25e2a1a77ecd8"} Mar 07 21:42:21.563243 master-0 kubenswrapper[16352]: I0307 21:42:21.563178 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-cinder\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563594 master-0 kubenswrapper[16352]: I0307 21:42:21.563429 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-run\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563594 master-0 kubenswrapper[16352]: I0307 21:42:21.563519 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-dev\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563733 master-0 kubenswrapper[16352]: I0307 
21:42:21.563618 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-iscsi\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563733 master-0 kubenswrapper[16352]: I0307 21:42:21.563668 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-machine-id\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563833 master-0 kubenswrapper[16352]: I0307 21:42:21.563738 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-lib-modules\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563833 master-0 kubenswrapper[16352]: I0307 21:42:21.563803 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-brick\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563927 master-0 kubenswrapper[16352]: I0307 21:42:21.563851 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-lib-cinder\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.563927 master-0 kubenswrapper[16352]: I0307 21:42:21.563912 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-nvme\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.564553 master-0 kubenswrapper[16352]: I0307 21:42:21.564525 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-sys\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.577875 master-0 kubenswrapper[16352]: I0307 21:42:21.572170 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data-custom\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.578797 master-0 kubenswrapper[16352]: I0307 21:42:21.578599 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.584973 master-0 kubenswrapper[16352]: I0307 21:42:21.584928 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6j8h\" (UniqueName: \"kubernetes.io/projected/801bdc03-e76c-4dfb-a97e-1327be4a522a-kube-api-access-k6j8h\") pod \"cinder-86971-backup-0\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.603866 master-0 kubenswrapper[16352]: I0307 21:42:21.603823 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-scripts\") pod \"cinder-86971-backup-0\" (UID: 
\"801bdc03-e76c-4dfb-a97e-1327be4a522a\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.669611 master-0 kubenswrapper[16352]: I0307 21:42:21.665372 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2ks5v\" (UniqueName: \"kubernetes.io/projected/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-kube-api-access-2ks5v\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.669611 master-0 kubenswrapper[16352]: I0307 21:42:21.665540 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-logs\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.669611 master-0 kubenswrapper[16352]: I0307 21:42:21.665748 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-combined-ca-bundle\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.669611 master-0 kubenswrapper[16352]: I0307 21:42:21.665903 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-scripts\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.669611 master-0 kubenswrapper[16352]: I0307 21:42:21.666325 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data-custom\") pod \"cinder-86971-api-0\" (UID: 
\"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.669611 master-0 kubenswrapper[16352]: I0307 21:42:21.666579 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.669611 master-0 kubenswrapper[16352]: I0307 21:42:21.666979 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-etc-machine-id\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.771821 master-0 kubenswrapper[16352]: I0307 21:42:21.771428 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-etc-machine-id\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.771821 master-0 kubenswrapper[16352]: I0307 21:42:21.771571 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2ks5v\" (UniqueName: \"kubernetes.io/projected/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-kube-api-access-2ks5v\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.771821 master-0 kubenswrapper[16352]: I0307 21:42:21.771737 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-etc-machine-id\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " 
pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.771821 master-0 kubenswrapper[16352]: I0307 21:42:21.771817 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-logs\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.771821 master-0 kubenswrapper[16352]: I0307 21:42:21.771924 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-combined-ca-bundle\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.772368 master-0 kubenswrapper[16352]: I0307 21:42:21.772008 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-scripts\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.772704 master-0 kubenswrapper[16352]: I0307 21:42:21.772499 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-logs\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.772704 master-0 kubenswrapper[16352]: I0307 21:42:21.772548 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data-custom\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.772704 master-0 kubenswrapper[16352]: I0307 21:42:21.772599 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.777422 master-0 kubenswrapper[16352]: I0307 21:42:21.777364 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data-custom\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.777608 master-0 kubenswrapper[16352]: I0307 21:42:21.777571 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.781566 master-0 kubenswrapper[16352]: I0307 21:42:21.781525 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-combined-ca-bundle\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.789904 master-0 kubenswrapper[16352]: I0307 21:42:21.789858 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2ks5v\" (UniqueName: \"kubernetes.io/projected/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-kube-api-access-2ks5v\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.794695 master-0 kubenswrapper[16352]: I0307 21:42:21.794628 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-scripts\") pod \"cinder-86971-api-0\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.838336 master-0 kubenswrapper[16352]: I0307 21:42:21.835797 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-api-0" Mar 07 21:42:21.903413 master-0 kubenswrapper[16352]: I0307 21:42:21.902902 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-backup-0" Mar 07 21:42:21.953503 master-0 kubenswrapper[16352]: I0307 21:42:21.951469 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-scheduler-0"] Mar 07 21:42:22.155024 master-0 kubenswrapper[16352]: W0307 21:42:22.154966 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3666942_0515_42ad_aa2a_20be90d7bc83.slice/crio-758dddbb0f308bd2811b7f22ddbd1a38eb6470e67a6f7064833e11147e57e00a WatchSource:0}: Error finding container 758dddbb0f308bd2811b7f22ddbd1a38eb6470e67a6f7064833e11147e57e00a: Status 404 returned error can't find the container with id 758dddbb0f308bd2811b7f22ddbd1a38eb6470e67a6f7064833e11147e57e00a Mar 07 21:42:22.168405 master-0 kubenswrapper[16352]: W0307 21:42:22.168315 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda962b1a6_daaa_4ffe_9395_419843cd2a6f.slice/crio-4e902407f8d6ba830ab9903b784ff4c613eb648cec02845b509b45a66f941ccb WatchSource:0}: Error finding container 4e902407f8d6ba830ab9903b784ff4c613eb648cec02845b509b45a66f941ccb: Status 404 returned error can't find the container with id 4e902407f8d6ba830ab9903b784ff4c613eb648cec02845b509b45a66f941ccb Mar 07 21:42:22.181939 master-0 kubenswrapper[16352]: I0307 21:42:22.181888 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:22.211048 master-0 kubenswrapper[16352]: I0307 21:42:22.205157 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-589dd8c5c-bm6b7"] Mar 07 21:42:22.470526 master-0 kubenswrapper[16352]: I0307 21:42:22.470454 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:22.590837 master-0 kubenswrapper[16352]: I0307 21:42:22.590696 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"156d22b1-6d30-42b4-8cff-e5a563d2861d","Type":"ContainerStarted","Data":"90001a728a47c32c0357761d9ffbfda8c8b4ade72dab2dd1fa37eb7eef6323ad"} Mar 07 21:42:22.596791 master-0 kubenswrapper[16352]: I0307 21:42:22.595795 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"c3666942-0515-42ad-aa2a-20be90d7bc83","Type":"ContainerStarted","Data":"758dddbb0f308bd2811b7f22ddbd1a38eb6470e67a6f7064833e11147e57e00a"} Mar 07 21:42:22.600936 master-0 kubenswrapper[16352]: I0307 21:42:22.600796 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"d48ddd87-c87f-44a9-8cdd-a41bf391b97c","Type":"ContainerStarted","Data":"4740bf7f572a2694b4a5f63ba5ef9cafdfc32c7c23a55ca06e8b05e62f2e41ad"} Mar 07 21:42:22.606899 master-0 kubenswrapper[16352]: I0307 21:42:22.606647 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" event={"ID":"a962b1a6-daaa-4ffe-9395-419843cd2a6f","Type":"ContainerDied","Data":"998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3"} Mar 07 21:42:22.607992 master-0 kubenswrapper[16352]: I0307 21:42:22.606492 16352 generic.go:334] "Generic (PLEG): container finished" podID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerID="998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3" exitCode=0 Mar 07 21:42:22.607992 master-0 
kubenswrapper[16352]: I0307 21:42:22.607645 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" event={"ID":"a962b1a6-daaa-4ffe-9395-419843cd2a6f","Type":"ContainerStarted","Data":"4e902407f8d6ba830ab9903b784ff4c613eb648cec02845b509b45a66f941ccb"} Mar 07 21:42:22.768709 master-0 kubenswrapper[16352]: I0307 21:42:22.768310 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-backup-0"] Mar 07 21:42:22.836575 master-0 kubenswrapper[16352]: W0307 21:42:22.836499 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod801bdc03_e76c_4dfb_a97e_1327be4a522a.slice/crio-6c16c6d136eddd13f7f322d69ffa4bf4d9e645a845caded0ab71b591757e630b WatchSource:0}: Error finding container 6c16c6d136eddd13f7f322d69ffa4bf4d9e645a845caded0ab71b591757e630b: Status 404 returned error can't find the container with id 6c16c6d136eddd13f7f322d69ffa4bf4d9e645a845caded0ab71b591757e630b Mar 07 21:42:23.422129 master-0 kubenswrapper[16352]: I0307 21:42:23.420555 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-97jz8" Mar 07 21:42:23.533889 master-0 kubenswrapper[16352]: I0307 21:42:23.531834 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-config\") pod \"218021bb-e4db-42b1-a553-f2a373cd9565\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " Mar 07 21:42:23.533889 master-0 kubenswrapper[16352]: I0307 21:42:23.532277 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-combined-ca-bundle\") pod \"218021bb-e4db-42b1-a553-f2a373cd9565\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " Mar 07 21:42:23.533889 master-0 kubenswrapper[16352]: I0307 21:42:23.532359 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kwhp6\" (UniqueName: \"kubernetes.io/projected/218021bb-e4db-42b1-a553-f2a373cd9565-kube-api-access-kwhp6\") pod \"218021bb-e4db-42b1-a553-f2a373cd9565\" (UID: \"218021bb-e4db-42b1-a553-f2a373cd9565\") " Mar 07 21:42:23.558396 master-0 kubenswrapper[16352]: I0307 21:42:23.555444 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/218021bb-e4db-42b1-a553-f2a373cd9565-kube-api-access-kwhp6" (OuterVolumeSpecName: "kube-api-access-kwhp6") pod "218021bb-e4db-42b1-a553-f2a373cd9565" (UID: "218021bb-e4db-42b1-a553-f2a373cd9565"). InnerVolumeSpecName "kube-api-access-kwhp6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:23.638784 master-0 kubenswrapper[16352]: I0307 21:42:23.636431 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kwhp6\" (UniqueName: \"kubernetes.io/projected/218021bb-e4db-42b1-a553-f2a373cd9565-kube-api-access-kwhp6\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:23.639335 master-0 kubenswrapper[16352]: I0307 21:42:23.639079 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-config" (OuterVolumeSpecName: "config") pod "218021bb-e4db-42b1-a553-f2a373cd9565" (UID: "218021bb-e4db-42b1-a553-f2a373cd9565"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:23.639335 master-0 kubenswrapper[16352]: I0307 21:42:23.639188 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "218021bb-e4db-42b1-a553-f2a373cd9565" (UID: "218021bb-e4db-42b1-a553-f2a373cd9565"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:23.659646 master-0 kubenswrapper[16352]: I0307 21:42:23.658060 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-97jz8" event={"ID":"218021bb-e4db-42b1-a553-f2a373cd9565","Type":"ContainerDied","Data":"93b7f4e3ae4f2429498f15d2af853fd9d8540ff43a42b6c42a24df6e86b82b16"} Mar 07 21:42:23.659646 master-0 kubenswrapper[16352]: I0307 21:42:23.658109 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="93b7f4e3ae4f2429498f15d2af853fd9d8540ff43a42b6c42a24df6e86b82b16" Mar 07 21:42:23.659646 master-0 kubenswrapper[16352]: I0307 21:42:23.658202 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-97jz8" Mar 07 21:42:23.668457 master-0 kubenswrapper[16352]: I0307 21:42:23.667936 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" event={"ID":"a962b1a6-daaa-4ffe-9395-419843cd2a6f","Type":"ContainerStarted","Data":"d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7"} Mar 07 21:42:23.668457 master-0 kubenswrapper[16352]: I0307 21:42:23.668360 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" Mar 07 21:42:23.672299 master-0 kubenswrapper[16352]: I0307 21:42:23.672203 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"801bdc03-e76c-4dfb-a97e-1327be4a522a","Type":"ContainerStarted","Data":"6c16c6d136eddd13f7f322d69ffa4bf4d9e645a845caded0ab71b591757e630b"} Mar 07 21:42:23.712785 master-0 kubenswrapper[16352]: I0307 21:42:23.712179 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:23.712785 master-0 kubenswrapper[16352]: I0307 21:42:23.712679 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" podStartSLOduration=3.712651137 podStartE2EDuration="3.712651137s" podCreationTimestamp="2026-03-07 21:42:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:23.695902895 +0000 UTC m=+1466.766607984" watchObservedRunningTime="2026-03-07 21:42:23.712651137 +0000 UTC m=+1466.783356196" Mar 07 21:42:23.739392 master-0 kubenswrapper[16352]: I0307 21:42:23.739090 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:23.739392 master-0 kubenswrapper[16352]: 
I0307 21:42:23.739142 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/218021bb-e4db-42b1-a553-f2a373cd9565-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:23.972764 master-0 kubenswrapper[16352]: I0307 21:42:23.972615 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589dd8c5c-bm6b7"] Mar 07 21:42:23.984965 master-0 kubenswrapper[16352]: I0307 21:42:23.984496 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7fb78888f7-pwtc8"] Mar 07 21:42:23.985107 master-0 kubenswrapper[16352]: E0307 21:42:23.985071 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="218021bb-e4db-42b1-a553-f2a373cd9565" containerName="neutron-db-sync" Mar 07 21:42:23.985107 master-0 kubenswrapper[16352]: I0307 21:42:23.985091 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="218021bb-e4db-42b1-a553-f2a373cd9565" containerName="neutron-db-sync" Mar 07 21:42:23.985370 master-0 kubenswrapper[16352]: I0307 21:42:23.985338 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="218021bb-e4db-42b1-a553-f2a373cd9565" containerName="neutron-db-sync" Mar 07 21:42:23.991258 master-0 kubenswrapper[16352]: I0307 21:42:23.986606 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:24.079064 master-0 kubenswrapper[16352]: I0307 21:42:24.073181 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fb78888f7-pwtc8"] Mar 07 21:42:24.161956 master-0 kubenswrapper[16352]: I0307 21:42:24.156483 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-sb\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:24.161956 master-0 kubenswrapper[16352]: I0307 21:42:24.156597 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-config\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:24.161956 master-0 kubenswrapper[16352]: I0307 21:42:24.156798 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-swift-storage-0\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:24.161956 master-0 kubenswrapper[16352]: I0307 21:42:24.156832 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-svc\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:24.161956 master-0 kubenswrapper[16352]: I0307 21:42:24.156867 16352 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-nb\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:24.161956 master-0 kubenswrapper[16352]: I0307 21:42:24.156903 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b78v6\" (UniqueName: \"kubernetes.io/projected/c6ec644f-79e0-428a-a260-c2dde4320020-kube-api-access-b78v6\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:24.219843 master-0 kubenswrapper[16352]: I0307 21:42:24.214699 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-f49f69884-v8xz2"] Mar 07 21:42:24.219843 master-0 kubenswrapper[16352]: I0307 21:42:24.218074 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.223702 master-0 kubenswrapper[16352]: I0307 21:42:24.222343 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config"
Mar 07 21:42:24.239408 master-0 kubenswrapper[16352]: I0307 21:42:24.234378 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs"
Mar 07 21:42:24.239408 master-0 kubenswrapper[16352]: I0307 21:42:24.234664 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.301173 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b78v6\" (UniqueName: \"kubernetes.io/projected/c6ec644f-79e0-428a-a260-c2dde4320020-kube-api-access-b78v6\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.301348 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-sb\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.301483 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-config\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.301901 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-swift-storage-0\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.301969 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-svc\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.302052 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-nb\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.302586 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-sb\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.302629 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-config\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.303328 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-nb\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.303668 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-svc\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.323435 master-0 kubenswrapper[16352]: I0307 21:42:24.303861 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-swift-storage-0\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.338204 master-0 kubenswrapper[16352]: I0307 21:42:24.338124 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b78v6\" (UniqueName: \"kubernetes.io/projected/c6ec644f-79e0-428a-a260-c2dde4320020-kube-api-access-b78v6\") pod \"dnsmasq-dns-7fb78888f7-pwtc8\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") " pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.341947 master-0 kubenswrapper[16352]: I0307 21:42:24.341851 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f49f69884-v8xz2"]
Mar 07 21:42:24.407738 master-0 kubenswrapper[16352]: I0307 21:42:24.404458 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-ovndb-tls-certs\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.407738 master-0 kubenswrapper[16352]: I0307 21:42:24.404604 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-config\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.407738 master-0 kubenswrapper[16352]: I0307 21:42:24.404705 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-httpd-config\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.407738 master-0 kubenswrapper[16352]: I0307 21:42:24.404743 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-combined-ca-bundle\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.407738 master-0 kubenswrapper[16352]: I0307 21:42:24.404837 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8rdh\" (UniqueName: \"kubernetes.io/projected/8c877c04-56be-4df0-b751-4691351e9f5d-kube-api-access-h8rdh\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.507996 master-0 kubenswrapper[16352]: I0307 21:42:24.507435 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-ovndb-tls-certs\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.508391 master-0 kubenswrapper[16352]: I0307 21:42:24.508265 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-config\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.511705 master-0 kubenswrapper[16352]: I0307 21:42:24.508633 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-httpd-config\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.511705 master-0 kubenswrapper[16352]: I0307 21:42:24.508744 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-combined-ca-bundle\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.511705 master-0 kubenswrapper[16352]: I0307 21:42:24.509064 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8rdh\" (UniqueName: \"kubernetes.io/projected/8c877c04-56be-4df0-b751-4691351e9f5d-kube-api-access-h8rdh\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.516706 master-0 kubenswrapper[16352]: I0307 21:42:24.511954 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-ovndb-tls-certs\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.516706 master-0 kubenswrapper[16352]: I0307 21:42:24.514825 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-config\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.520701 master-0 kubenswrapper[16352]: I0307 21:42:24.519099 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-combined-ca-bundle\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.524702 master-0 kubenswrapper[16352]: I0307 21:42:24.521283 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-httpd-config\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.535098 master-0 kubenswrapper[16352]: I0307 21:42:24.530660 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8rdh\" (UniqueName: \"kubernetes.io/projected/8c877c04-56be-4df0-b751-4691351e9f5d-kube-api-access-h8rdh\") pod \"neutron-f49f69884-v8xz2\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") " pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.599704 master-0 kubenswrapper[16352]: I0307 21:42:24.596176 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:24.618703 master-0 kubenswrapper[16352]: I0307 21:42:24.617772 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:24.696703 master-0 kubenswrapper[16352]: I0307 21:42:24.696399 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"d48ddd87-c87f-44a9-8cdd-a41bf391b97c","Type":"ContainerStarted","Data":"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea"}
Mar 07 21:42:24.705720 master-0 kubenswrapper[16352]: I0307 21:42:24.703146 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"801bdc03-e76c-4dfb-a97e-1327be4a522a","Type":"ContainerStarted","Data":"161ae14484b48f26c24528d92a92ab3ea4cd07507499acb11ec2f9706bc4d932"}
Mar 07 21:42:24.705720 master-0 kubenswrapper[16352]: I0307 21:42:24.703291 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"801bdc03-e76c-4dfb-a97e-1327be4a522a","Type":"ContainerStarted","Data":"b822ddef479a10d11d2bfbac6c105b64ecc3912b656c9b8d5c04d08bcf7ebb01"}
Mar 07 21:42:24.730983 master-0 kubenswrapper[16352]: I0307 21:42:24.727530 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"156d22b1-6d30-42b4-8cff-e5a563d2861d","Type":"ContainerStarted","Data":"07e1463666ed4bc8941e56a2fb227bab60740dff683bc71760cfd12a657ad0ab"}
Mar 07 21:42:24.734710 master-0 kubenswrapper[16352]: I0307 21:42:24.734097 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"c3666942-0515-42ad-aa2a-20be90d7bc83","Type":"ContainerStarted","Data":"0c3281a5a62aa7167b934e02e0ce4b52dd36eac632736db435e2f8e4a252d234"}
Mar 07 21:42:24.734710 master-0 kubenswrapper[16352]: I0307 21:42:24.734150 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"c3666942-0515-42ad-aa2a-20be90d7bc83","Type":"ContainerStarted","Data":"be583d403489b6e6d7035f5a522e0cb98d638b8ae91cbbd533cf866161a60b68"}
Mar 07 21:42:24.879745 master-0 kubenswrapper[16352]: I0307 21:42:24.867558 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-backup-0" podStartSLOduration=2.932916573 podStartE2EDuration="3.867467406s" podCreationTimestamp="2026-03-07 21:42:21 +0000 UTC" firstStartedPulling="2026-03-07 21:42:22.850044101 +0000 UTC m=+1465.920749160" lastFinishedPulling="2026-03-07 21:42:23.784594934 +0000 UTC m=+1466.855299993" observedRunningTime="2026-03-07 21:42:24.820852038 +0000 UTC m=+1467.891557107" watchObservedRunningTime="2026-03-07 21:42:24.867467406 +0000 UTC m=+1467.938172475"
Mar 07 21:42:25.373052 master-0 kubenswrapper[16352]: I0307 21:42:25.365992 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-volume-lvm-iscsi-0" podStartSLOduration=4.356142553 podStartE2EDuration="5.365959632s" podCreationTimestamp="2026-03-07 21:42:20 +0000 UTC" firstStartedPulling="2026-03-07 21:42:22.168639695 +0000 UTC m=+1465.239344754" lastFinishedPulling="2026-03-07 21:42:23.178456774 +0000 UTC m=+1466.249161833" observedRunningTime="2026-03-07 21:42:24.990829548 +0000 UTC m=+1468.061534617" watchObservedRunningTime="2026-03-07 21:42:25.365959632 +0000 UTC m=+1468.436664691"
Mar 07 21:42:25.395404 master-0 kubenswrapper[16352]: W0307 21:42:25.391144 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc6ec644f_79e0_428a_a260_c2dde4320020.slice/crio-61cac17473ccbb3e78d95ba13f2f4f687e3bd74dce5b3a8a49fe3f1c291ce705 WatchSource:0}: Error finding container 61cac17473ccbb3e78d95ba13f2f4f687e3bd74dce5b3a8a49fe3f1c291ce705: Status 404 returned error can't find the container with id 61cac17473ccbb3e78d95ba13f2f4f687e3bd74dce5b3a8a49fe3f1c291ce705
Mar 07 21:42:25.405973 master-0 kubenswrapper[16352]: I0307 21:42:25.405809 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7fb78888f7-pwtc8"]
Mar 07 21:42:25.627268 master-0 kubenswrapper[16352]: I0307 21:42:25.627208 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-f49f69884-v8xz2"]
Mar 07 21:42:25.762029 master-0 kubenswrapper[16352]: I0307 21:42:25.761948 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"156d22b1-6d30-42b4-8cff-e5a563d2861d","Type":"ContainerStarted","Data":"df24f3966ee9c06e2387b5a4896a043900b48426a93d9c0a9c63c6527a518b8b"}
Mar 07 21:42:25.766433 master-0 kubenswrapper[16352]: I0307 21:42:25.766203 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"d48ddd87-c87f-44a9-8cdd-a41bf391b97c","Type":"ContainerStarted","Data":"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1"}
Mar 07 21:42:25.766736 master-0 kubenswrapper[16352]: I0307 21:42:25.766504 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-api-0" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-86971-api-log" containerID="cri-o://39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea" gracePeriod=30
Mar 07 21:42:25.766736 master-0 kubenswrapper[16352]: I0307 21:42:25.766652 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-86971-api-0"
Mar 07 21:42:25.766821 master-0 kubenswrapper[16352]: I0307 21:42:25.766657 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-api-0" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-api" containerID="cri-o://78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1" gracePeriod=30
Mar 07 21:42:25.773105 master-0 kubenswrapper[16352]: I0307 21:42:25.773024 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" event={"ID":"c6ec644f-79e0-428a-a260-c2dde4320020","Type":"ContainerStarted","Data":"61cac17473ccbb3e78d95ba13f2f4f687e3bd74dce5b3a8a49fe3f1c291ce705"}
Mar 07 21:42:25.774832 master-0 kubenswrapper[16352]: I0307 21:42:25.774759 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f49f69884-v8xz2" event={"ID":"8c877c04-56be-4df0-b751-4691351e9f5d","Type":"ContainerStarted","Data":"1517a90c24294061c003125fcc14aa3418e09b97326300ca39ca2db7281f1f96"}
Mar 07 21:42:25.775458 master-0 kubenswrapper[16352]: I0307 21:42:25.775392 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" podUID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerName="dnsmasq-dns" containerID="cri-o://d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7" gracePeriod=10
Mar 07 21:42:25.808707 master-0 kubenswrapper[16352]: I0307 21:42:25.803469 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-scheduler-0" podStartSLOduration=4.929332512 podStartE2EDuration="5.803439214s" podCreationTimestamp="2026-03-07 21:42:20 +0000 UTC" firstStartedPulling="2026-03-07 21:42:22.029543836 +0000 UTC m=+1465.100248895" lastFinishedPulling="2026-03-07 21:42:22.903650538 +0000 UTC m=+1465.974355597" observedRunningTime="2026-03-07 21:42:25.796421156 +0000 UTC m=+1468.867126225" watchObservedRunningTime="2026-03-07 21:42:25.803439214 +0000 UTC m=+1468.874144263"
Mar 07 21:42:25.855224 master-0 kubenswrapper[16352]: I0307 21:42:25.855058 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-api-0" podStartSLOduration=4.855003362 podStartE2EDuration="4.855003362s" podCreationTimestamp="2026-03-07 21:42:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:25.82951102 +0000 UTC m=+1468.900216079" watchObservedRunningTime="2026-03-07 21:42:25.855003362 +0000 UTC m=+1468.925708431"
Mar 07 21:42:26.305335 master-0 kubenswrapper[16352]: I0307 21:42:26.303991 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:26.518326 master-0 kubenswrapper[16352]: I0307 21:42:26.508567 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-86971-volume-lvm-iscsi-0"
Mar 07 21:42:26.548829 master-0 kubenswrapper[16352]: I0307 21:42:26.548484 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7"
Mar 07 21:42:26.602615 master-0 kubenswrapper[16352]: I0307 21:42:26.602534 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjnj9\" (UniqueName: \"kubernetes.io/projected/a962b1a6-daaa-4ffe-9395-419843cd2a6f-kube-api-access-rjnj9\") pod \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") "
Mar 07 21:42:26.602793 master-0 kubenswrapper[16352]: I0307 21:42:26.602629 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-nb\") pod \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") "
Mar 07 21:42:26.602793 master-0 kubenswrapper[16352]: I0307 21:42:26.602733 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-config\") pod \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") "
Mar 07 21:42:26.602871 master-0 kubenswrapper[16352]: I0307 21:42:26.602827 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-svc\") pod \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") "
Mar 07 21:42:26.603025 master-0 kubenswrapper[16352]: I0307 21:42:26.602998 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-swift-storage-0\") pod \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") "
Mar 07 21:42:26.603071 master-0 kubenswrapper[16352]: I0307 21:42:26.603031 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-sb\") pod \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\" (UID: \"a962b1a6-daaa-4ffe-9395-419843cd2a6f\") "
Mar 07 21:42:26.622007 master-0 kubenswrapper[16352]: I0307 21:42:26.621942 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a962b1a6-daaa-4ffe-9395-419843cd2a6f-kube-api-access-rjnj9" (OuterVolumeSpecName: "kube-api-access-rjnj9") pod "a962b1a6-daaa-4ffe-9395-419843cd2a6f" (UID: "a962b1a6-daaa-4ffe-9395-419843cd2a6f"). InnerVolumeSpecName "kube-api-access-rjnj9". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:42:26.694114 master-0 kubenswrapper[16352]: I0307 21:42:26.692577 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a962b1a6-daaa-4ffe-9395-419843cd2a6f" (UID: "a962b1a6-daaa-4ffe-9395-419843cd2a6f"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:26.695083 master-0 kubenswrapper[16352]: I0307 21:42:26.694950 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-api-0"
Mar 07 21:42:26.706696 master-0 kubenswrapper[16352]: I0307 21:42:26.706592 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.706696 master-0 kubenswrapper[16352]: I0307 21:42:26.706649 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rjnj9\" (UniqueName: \"kubernetes.io/projected/a962b1a6-daaa-4ffe-9395-419843cd2a6f-kube-api-access-rjnj9\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.716358 master-0 kubenswrapper[16352]: I0307 21:42:26.716211 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-config" (OuterVolumeSpecName: "config") pod "a962b1a6-daaa-4ffe-9395-419843cd2a6f" (UID: "a962b1a6-daaa-4ffe-9395-419843cd2a6f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:26.716755 master-0 kubenswrapper[16352]: I0307 21:42:26.716677 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a962b1a6-daaa-4ffe-9395-419843cd2a6f" (UID: "a962b1a6-daaa-4ffe-9395-419843cd2a6f"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:26.729569 master-0 kubenswrapper[16352]: I0307 21:42:26.728833 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a962b1a6-daaa-4ffe-9395-419843cd2a6f" (UID: "a962b1a6-daaa-4ffe-9395-419843cd2a6f"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:26.746835 master-0 kubenswrapper[16352]: I0307 21:42:26.745542 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a962b1a6-daaa-4ffe-9395-419843cd2a6f" (UID: "a962b1a6-daaa-4ffe-9395-419843cd2a6f"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:26.808823 master-0 kubenswrapper[16352]: I0307 21:42:26.800844 16352 generic.go:334] "Generic (PLEG): container finished" podID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerID="78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1" exitCode=0
Mar 07 21:42:26.808823 master-0 kubenswrapper[16352]: I0307 21:42:26.800906 16352 generic.go:334] "Generic (PLEG): container finished" podID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerID="39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea" exitCode=143
Mar 07 21:42:26.808823 master-0 kubenswrapper[16352]: I0307 21:42:26.800978 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"d48ddd87-c87f-44a9-8cdd-a41bf391b97c","Type":"ContainerDied","Data":"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1"}
Mar 07 21:42:26.808823 master-0 kubenswrapper[16352]: I0307 21:42:26.801028 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"d48ddd87-c87f-44a9-8cdd-a41bf391b97c","Type":"ContainerDied","Data":"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea"}
Mar 07 21:42:26.808823 master-0 kubenswrapper[16352]: I0307 21:42:26.801042 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"d48ddd87-c87f-44a9-8cdd-a41bf391b97c","Type":"ContainerDied","Data":"4740bf7f572a2694b4a5f63ba5ef9cafdfc32c7c23a55ca06e8b05e62f2e41ad"}
Mar 07 21:42:26.808823 master-0 kubenswrapper[16352]: I0307 21:42:26.801064 16352 scope.go:117] "RemoveContainer" containerID="78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1"
Mar 07 21:42:26.808823 master-0 kubenswrapper[16352]: I0307 21:42:26.801221 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-api-0"
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.811093 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2ks5v\" (UniqueName: \"kubernetes.io/projected/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-kube-api-access-2ks5v\") pod \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") "
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.811249 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-scripts\") pod \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") "
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.811316 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data-custom\") pod \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") "
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.811356 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-combined-ca-bundle\") pod \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") "
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.811501 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-etc-machine-id\") pod \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") "
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.811742 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data\") pod \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") "
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.811784 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-logs\") pod \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\" (UID: \"d48ddd87-c87f-44a9-8cdd-a41bf391b97c\") "
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.812971 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.812998 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.813014 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.813027 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a962b1a6-daaa-4ffe-9395-419843cd2a6f-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.815968 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-kube-api-access-2ks5v" (OuterVolumeSpecName: "kube-api-access-2ks5v") pod "d48ddd87-c87f-44a9-8cdd-a41bf391b97c" (UID: "d48ddd87-c87f-44a9-8cdd-a41bf391b97c"). InnerVolumeSpecName "kube-api-access-2ks5v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.816017 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-scripts" (OuterVolumeSpecName: "scripts") pod "d48ddd87-c87f-44a9-8cdd-a41bf391b97c" (UID: "d48ddd87-c87f-44a9-8cdd-a41bf391b97c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.816034 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "d48ddd87-c87f-44a9-8cdd-a41bf391b97c" (UID: "d48ddd87-c87f-44a9-8cdd-a41bf391b97c"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.823364 16352 generic.go:334] "Generic (PLEG): container finished" podID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerID="d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7" exitCode=0
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.823442 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" event={"ID":"a962b1a6-daaa-4ffe-9395-419843cd2a6f","Type":"ContainerDied","Data":"d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7"}
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.823474 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7" event={"ID":"a962b1a6-daaa-4ffe-9395-419843cd2a6f","Type":"ContainerDied","Data":"4e902407f8d6ba830ab9903b784ff4c613eb648cec02845b509b45a66f941ccb"}
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.823544 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-589dd8c5c-bm6b7"
Mar 07 21:42:26.828769 master-0 kubenswrapper[16352]: I0307 21:42:26.824091 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-logs" (OuterVolumeSpecName: "logs") pod "d48ddd87-c87f-44a9-8cdd-a41bf391b97c" (UID: "d48ddd87-c87f-44a9-8cdd-a41bf391b97c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:42:26.847836 master-0 kubenswrapper[16352]: I0307 21:42:26.841822 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "d48ddd87-c87f-44a9-8cdd-a41bf391b97c" (UID: "d48ddd87-c87f-44a9-8cdd-a41bf391b97c"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:26.913601 master-0 kubenswrapper[16352]: I0307 21:42:26.901158 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d48ddd87-c87f-44a9-8cdd-a41bf391b97c" (UID: "d48ddd87-c87f-44a9-8cdd-a41bf391b97c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:26.913601 master-0 kubenswrapper[16352]: I0307 21:42:26.901504 16352 generic.go:334] "Generic (PLEG): container finished" podID="c6ec644f-79e0-428a-a260-c2dde4320020" containerID="a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55" exitCode=0
Mar 07 21:42:26.913601 master-0 kubenswrapper[16352]: I0307 21:42:26.901605 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" event={"ID":"c6ec644f-79e0-428a-a260-c2dde4320020","Type":"ContainerDied","Data":"a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55"}
Mar 07 21:42:26.913601 master-0 kubenswrapper[16352]: I0307 21:42:26.903254 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:26.931906 master-0 kubenswrapper[16352]: I0307 21:42:26.920361 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2ks5v\" (UniqueName: \"kubernetes.io/projected/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-kube-api-access-2ks5v\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.931906 master-0 kubenswrapper[16352]: I0307 21:42:26.920456 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.931906 master-0 kubenswrapper[16352]: I0307 21:42:26.920475 16352 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.931906 master-0 kubenswrapper[16352]: I0307 21:42:26.920488 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.931906 master-0 kubenswrapper[16352]: I0307 21:42:26.920507 16352 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.931906 master-0 kubenswrapper[16352]: I0307 21:42:26.920520 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-logs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:26.955516 master-0 kubenswrapper[16352]: I0307 21:42:26.952741 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f49f69884-v8xz2" event={"ID":"8c877c04-56be-4df0-b751-4691351e9f5d","Type":"ContainerStarted","Data":"71178167e291da054ac2196c37ee9b7fc49465fe3da171034ec208c32399bdcf"}
Mar 07 21:42:26.955516 master-0 kubenswrapper[16352]: I0307 21:42:26.952828 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f49f69884-v8xz2" event={"ID":"8c877c04-56be-4df0-b751-4691351e9f5d","Type":"ContainerStarted","Data":"fa9fa60fa174582ce2847403b83e63de271e277c86bae70067df78a0f025b6ee"}
Mar 07 21:42:26.955516 master-0 kubenswrapper[16352]: I0307 21:42:26.953605 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:26.965833 master-0 kubenswrapper[16352]: I0307 21:42:26.962922 16352 operation_generator.go:803]
UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data" (OuterVolumeSpecName: "config-data") pod "d48ddd87-c87f-44a9-8cdd-a41bf391b97c" (UID: "d48ddd87-c87f-44a9-8cdd-a41bf391b97c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:27.038834 master-0 kubenswrapper[16352]: I0307 21:42:27.034318 16352 scope.go:117] "RemoveContainer" containerID="39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea" Mar 07 21:42:27.051910 master-0 kubenswrapper[16352]: I0307 21:42:27.051827 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d48ddd87-c87f-44a9-8cdd-a41bf391b97c-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:27.250597 master-0 kubenswrapper[16352]: I0307 21:42:27.250528 16352 scope.go:117] "RemoveContainer" containerID="78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1" Mar 07 21:42:27.251449 master-0 kubenswrapper[16352]: E0307 21:42:27.251418 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1\": container with ID starting with 78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1 not found: ID does not exist" containerID="78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1" Mar 07 21:42:27.251543 master-0 kubenswrapper[16352]: I0307 21:42:27.251474 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1"} err="failed to get container status \"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1\": rpc error: code = NotFound desc = could not find container \"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1\": container with ID starting with 
78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1 not found: ID does not exist" Mar 07 21:42:27.251543 master-0 kubenswrapper[16352]: I0307 21:42:27.251502 16352 scope.go:117] "RemoveContainer" containerID="39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea" Mar 07 21:42:27.252556 master-0 kubenswrapper[16352]: E0307 21:42:27.251840 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea\": container with ID starting with 39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea not found: ID does not exist" containerID="39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea" Mar 07 21:42:27.252556 master-0 kubenswrapper[16352]: I0307 21:42:27.251875 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea"} err="failed to get container status \"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea\": rpc error: code = NotFound desc = could not find container \"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea\": container with ID starting with 39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea not found: ID does not exist" Mar 07 21:42:27.252556 master-0 kubenswrapper[16352]: I0307 21:42:27.251893 16352 scope.go:117] "RemoveContainer" containerID="78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1" Mar 07 21:42:27.278882 master-0 kubenswrapper[16352]: I0307 21:42:27.253639 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1"} err="failed to get container status \"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1\": rpc error: code = NotFound desc = could not find container 
\"78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1\": container with ID starting with 78e5681081c4604d60996223b75f3ebdc4c49134c220e0d6730af5b6ab5c98f1 not found: ID does not exist" Mar 07 21:42:27.278882 master-0 kubenswrapper[16352]: I0307 21:42:27.274777 16352 scope.go:117] "RemoveContainer" containerID="39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea" Mar 07 21:42:27.278882 master-0 kubenswrapper[16352]: I0307 21:42:27.274401 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-589dd8c5c-bm6b7"] Mar 07 21:42:27.278882 master-0 kubenswrapper[16352]: I0307 21:42:27.274978 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-589dd8c5c-bm6b7"] Mar 07 21:42:27.278882 master-0 kubenswrapper[16352]: I0307 21:42:27.275883 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-f49f69884-v8xz2" podStartSLOduration=3.275849248 podStartE2EDuration="3.275849248s" podCreationTimestamp="2026-03-07 21:42:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:27.200064308 +0000 UTC m=+1470.270769367" watchObservedRunningTime="2026-03-07 21:42:27.275849248 +0000 UTC m=+1470.346554307" Mar 07 21:42:27.280619 master-0 kubenswrapper[16352]: I0307 21:42:27.280546 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea"} err="failed to get container status \"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea\": rpc error: code = NotFound desc = could not find container \"39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea\": container with ID starting with 39b42ccc893def2e615f8b6b627478022298e151f4cef8f2ee70ed7d7b268fea not found: ID does not exist" Mar 07 21:42:27.280619 master-0 kubenswrapper[16352]: I0307 
21:42:27.280619 16352 scope.go:117] "RemoveContainer" containerID="d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7" Mar 07 21:42:27.330716 master-0 kubenswrapper[16352]: I0307 21:42:27.323484 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:27.337422 master-0 kubenswrapper[16352]: I0307 21:42:27.337295 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:27.359316 master-0 kubenswrapper[16352]: I0307 21:42:27.359251 16352 scope.go:117] "RemoveContainer" containerID="998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3" Mar 07 21:42:27.371893 master-0 kubenswrapper[16352]: I0307 21:42:27.371767 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:27.372589 master-0 kubenswrapper[16352]: E0307 21:42:27.372551 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerName="dnsmasq-dns" Mar 07 21:42:27.372589 master-0 kubenswrapper[16352]: I0307 21:42:27.372582 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerName="dnsmasq-dns" Mar 07 21:42:27.372671 master-0 kubenswrapper[16352]: E0307 21:42:27.372615 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerName="init" Mar 07 21:42:27.372671 master-0 kubenswrapper[16352]: I0307 21:42:27.372624 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerName="init" Mar 07 21:42:27.372671 master-0 kubenswrapper[16352]: E0307 21:42:27.372663 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-api" Mar 07 21:42:27.372671 master-0 kubenswrapper[16352]: I0307 21:42:27.372671 16352 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-api" Mar 07 21:42:27.372846 master-0 kubenswrapper[16352]: E0307 21:42:27.372724 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-86971-api-log" Mar 07 21:42:27.372846 master-0 kubenswrapper[16352]: I0307 21:42:27.372733 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-86971-api-log" Mar 07 21:42:27.373156 master-0 kubenswrapper[16352]: I0307 21:42:27.372992 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-api" Mar 07 21:42:27.373156 master-0 kubenswrapper[16352]: I0307 21:42:27.373039 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" containerName="cinder-86971-api-log" Mar 07 21:42:27.373156 master-0 kubenswrapper[16352]: I0307 21:42:27.373065 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" containerName="dnsmasq-dns" Mar 07 21:42:27.375514 master-0 kubenswrapper[16352]: I0307 21:42:27.374437 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.377897 master-0 kubenswrapper[16352]: I0307 21:42:27.377833 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 07 21:42:27.378705 master-0 kubenswrapper[16352]: I0307 21:42:27.378666 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 07 21:42:27.382483 master-0 kubenswrapper[16352]: I0307 21:42:27.379881 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-api-config-data" Mar 07 21:42:27.382775 master-0 kubenswrapper[16352]: I0307 21:42:27.382741 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:27.406312 master-0 kubenswrapper[16352]: I0307 21:42:27.406033 16352 scope.go:117] "RemoveContainer" containerID="d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7" Mar 07 21:42:27.414194 master-0 kubenswrapper[16352]: E0307 21:42:27.414044 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7\": container with ID starting with d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7 not found: ID does not exist" containerID="d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7" Mar 07 21:42:27.414194 master-0 kubenswrapper[16352]: I0307 21:42:27.414098 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7"} err="failed to get container status \"d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7\": rpc error: code = NotFound desc = could not find container \"d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7\": container with ID starting with 
d283e7e2ba12bb9f842c63db0e9afa9e6195bd6946c719473664a9903b1d98d7 not found: ID does not exist" Mar 07 21:42:27.414194 master-0 kubenswrapper[16352]: I0307 21:42:27.414129 16352 scope.go:117] "RemoveContainer" containerID="998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3" Mar 07 21:42:27.428581 master-0 kubenswrapper[16352]: E0307 21:42:27.428040 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3\": container with ID starting with 998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3 not found: ID does not exist" containerID="998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3" Mar 07 21:42:27.428581 master-0 kubenswrapper[16352]: I0307 21:42:27.428119 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3"} err="failed to get container status \"998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3\": rpc error: code = NotFound desc = could not find container \"998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3\": container with ID starting with 998446b05f33fea4fca8b771ff10adedc5e78f0f5f3ffe19ceba3348a4eed2e3 not found: ID does not exist" Mar 07 21:42:27.487960 master-0 kubenswrapper[16352]: I0307 21:42:27.487769 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-combined-ca-bundle\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.488861 master-0 kubenswrapper[16352]: I0307 21:42:27.488833 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-public-tls-certs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.489207 master-0 kubenswrapper[16352]: I0307 21:42:27.489190 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/253c4b63-bf30-4275-9396-8a899301a4b9-etc-machine-id\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.489375 master-0 kubenswrapper[16352]: I0307 21:42:27.489357 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-config-data\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.489601 master-0 kubenswrapper[16352]: I0307 21:42:27.489585 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253c4b63-bf30-4275-9396-8a899301a4b9-logs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.489864 master-0 kubenswrapper[16352]: I0307 21:42:27.489819 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-scripts\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.490057 master-0 kubenswrapper[16352]: I0307 21:42:27.490042 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-internal-tls-certs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.490456 master-0 kubenswrapper[16352]: I0307 21:42:27.490428 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwnnj\" (UniqueName: \"kubernetes.io/projected/253c4b63-bf30-4275-9396-8a899301a4b9-kube-api-access-lwnnj\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.490599 master-0 kubenswrapper[16352]: I0307 21:42:27.490585 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-config-data-custom\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592524 master-0 kubenswrapper[16352]: I0307 21:42:27.592426 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-scripts\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592524 master-0 kubenswrapper[16352]: I0307 21:42:27.592520 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-internal-tls-certs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592883 master-0 kubenswrapper[16352]: I0307 21:42:27.592582 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwnnj\" (UniqueName: 
\"kubernetes.io/projected/253c4b63-bf30-4275-9396-8a899301a4b9-kube-api-access-lwnnj\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592883 master-0 kubenswrapper[16352]: I0307 21:42:27.592628 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-config-data-custom\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592883 master-0 kubenswrapper[16352]: I0307 21:42:27.592656 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-combined-ca-bundle\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592883 master-0 kubenswrapper[16352]: I0307 21:42:27.592766 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-public-tls-certs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592883 master-0 kubenswrapper[16352]: I0307 21:42:27.592828 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/253c4b63-bf30-4275-9396-8a899301a4b9-etc-machine-id\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.592883 master-0 kubenswrapper[16352]: I0307 21:42:27.592846 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-config-data\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.593075 master-0 kubenswrapper[16352]: I0307 21:42:27.592889 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253c4b63-bf30-4275-9396-8a899301a4b9-logs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.593510 master-0 kubenswrapper[16352]: I0307 21:42:27.593466 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/253c4b63-bf30-4275-9396-8a899301a4b9-logs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.595042 master-0 kubenswrapper[16352]: I0307 21:42:27.594282 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/253c4b63-bf30-4275-9396-8a899301a4b9-etc-machine-id\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.598117 master-0 kubenswrapper[16352]: I0307 21:42:27.598069 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-public-tls-certs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.598407 master-0 kubenswrapper[16352]: I0307 21:42:27.598365 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-config-data-custom\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") 
" pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.598611 master-0 kubenswrapper[16352]: I0307 21:42:27.598587 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-internal-tls-certs\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.598886 master-0 kubenswrapper[16352]: I0307 21:42:27.598862 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-scripts\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.599847 master-0 kubenswrapper[16352]: I0307 21:42:27.599668 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-config-data\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.601737 master-0 kubenswrapper[16352]: I0307 21:42:27.601717 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/253c4b63-bf30-4275-9396-8a899301a4b9-combined-ca-bundle\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.617185 master-0 kubenswrapper[16352]: I0307 21:42:27.617102 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwnnj\" (UniqueName: \"kubernetes.io/projected/253c4b63-bf30-4275-9396-8a899301a4b9-kube-api-access-lwnnj\") pod \"cinder-86971-api-0\" (UID: \"253c4b63-bf30-4275-9396-8a899301a4b9\") " pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.753570 master-0 kubenswrapper[16352]: I0307 21:42:27.753405 16352 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-api-0" Mar 07 21:42:27.974734 master-0 kubenswrapper[16352]: I0307 21:42:27.974670 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" event={"ID":"c6ec644f-79e0-428a-a260-c2dde4320020","Type":"ContainerStarted","Data":"dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63"} Mar 07 21:42:27.975348 master-0 kubenswrapper[16352]: I0307 21:42:27.975050 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:28.212727 master-0 kubenswrapper[16352]: I0307 21:42:28.211679 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" podStartSLOduration=5.21165017 podStartE2EDuration="5.21165017s" podCreationTimestamp="2026-03-07 21:42:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:28.195881442 +0000 UTC m=+1471.266586521" watchObservedRunningTime="2026-03-07 21:42:28.21165017 +0000 UTC m=+1471.282355239" Mar 07 21:42:28.233948 master-0 kubenswrapper[16352]: I0307 21:42:28.233835 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-fd8d8c7c7-w5vwh"] Mar 07 21:42:28.242533 master-0 kubenswrapper[16352]: I0307 21:42:28.240818 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.256729 master-0 kubenswrapper[16352]: I0307 21:42:28.254002 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 07 21:42:28.256729 master-0 kubenswrapper[16352]: I0307 21:42:28.254390 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 07 21:42:28.288711 master-0 kubenswrapper[16352]: I0307 21:42:28.285424 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fd8d8c7c7-w5vwh"] Mar 07 21:42:28.351229 master-0 kubenswrapper[16352]: I0307 21:42:28.341440 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-api-0"] Mar 07 21:42:28.358875 master-0 kubenswrapper[16352]: W0307 21:42:28.353821 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod253c4b63_bf30_4275_9396_8a899301a4b9.slice/crio-346dfeb58fd0ad12479d9b5df0cf8841d8c4347667cd28e7235905ed6adbfea7 WatchSource:0}: Error finding container 346dfeb58fd0ad12479d9b5df0cf8841d8c4347667cd28e7235905ed6adbfea7: Status 404 returned error can't find the container with id 346dfeb58fd0ad12479d9b5df0cf8841d8c4347667cd28e7235905ed6adbfea7 Mar 07 21:42:28.370101 master-0 kubenswrapper[16352]: I0307 21:42:28.364013 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-config\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.370101 master-0 kubenswrapper[16352]: I0307 21:42:28.364175 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-public-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.370101 master-0 kubenswrapper[16352]: I0307 21:42:28.364233 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-combined-ca-bundle\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.370101 master-0 kubenswrapper[16352]: I0307 21:42:28.364303 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-httpd-config\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.370101 master-0 kubenswrapper[16352]: I0307 21:42:28.364597 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-internal-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.370101 master-0 kubenswrapper[16352]: I0307 21:42:28.364729 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qv5n\" (UniqueName: \"kubernetes.io/projected/b995f4e5-10ab-47d2-939a-cb78465e3ea5-kube-api-access-9qv5n\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.370101 master-0 kubenswrapper[16352]: I0307 21:42:28.364856 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-ovndb-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.467934 master-0 kubenswrapper[16352]: I0307 21:42:28.467400 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-internal-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.467934 master-0 kubenswrapper[16352]: I0307 21:42:28.467468 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qv5n\" (UniqueName: \"kubernetes.io/projected/b995f4e5-10ab-47d2-939a-cb78465e3ea5-kube-api-access-9qv5n\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.467934 master-0 kubenswrapper[16352]: I0307 21:42:28.467518 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-ovndb-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.467934 master-0 kubenswrapper[16352]: I0307 21:42:28.467575 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-config\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.467934 master-0 kubenswrapper[16352]: I0307 21:42:28.467631 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-public-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.467934 master-0 kubenswrapper[16352]: I0307 21:42:28.467665 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-combined-ca-bundle\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.467934 master-0 kubenswrapper[16352]: I0307 21:42:28.467730 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-httpd-config\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.479811 master-0 kubenswrapper[16352]: I0307 21:42:28.472104 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-httpd-config\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.479811 master-0 kubenswrapper[16352]: I0307 21:42:28.474065 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-combined-ca-bundle\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.479811 master-0 kubenswrapper[16352]: I0307 21:42:28.477753 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-internal-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.479811 master-0 kubenswrapper[16352]: I0307 21:42:28.479668 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-public-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.492219 master-0 kubenswrapper[16352]: I0307 21:42:28.482246 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-config\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.496944 master-0 kubenswrapper[16352]: I0307 21:42:28.495827 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qv5n\" (UniqueName: \"kubernetes.io/projected/b995f4e5-10ab-47d2-939a-cb78465e3ea5-kube-api-access-9qv5n\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.502489 master-0 kubenswrapper[16352]: I0307 21:42:28.502421 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/b995f4e5-10ab-47d2-939a-cb78465e3ea5-ovndb-tls-certs\") pod \"neutron-fd8d8c7c7-w5vwh\" (UID: \"b995f4e5-10ab-47d2-939a-cb78465e3ea5\") " pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:28.620236 master-0 kubenswrapper[16352]: I0307 21:42:28.619317 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:29.034297 master-0 kubenswrapper[16352]: I0307 21:42:29.034217 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"253c4b63-bf30-4275-9396-8a899301a4b9","Type":"ContainerStarted","Data":"346dfeb58fd0ad12479d9b5df0cf8841d8c4347667cd28e7235905ed6adbfea7"} Mar 07 21:42:29.207932 master-0 kubenswrapper[16352]: I0307 21:42:29.207853 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a962b1a6-daaa-4ffe-9395-419843cd2a6f" path="/var/lib/kubelet/pods/a962b1a6-daaa-4ffe-9395-419843cd2a6f/volumes" Mar 07 21:42:29.208638 master-0 kubenswrapper[16352]: I0307 21:42:29.208605 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d48ddd87-c87f-44a9-8cdd-a41bf391b97c" path="/var/lib/kubelet/pods/d48ddd87-c87f-44a9-8cdd-a41bf391b97c/volumes" Mar 07 21:42:29.321265 master-0 kubenswrapper[16352]: I0307 21:42:29.318223 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-fd8d8c7c7-w5vwh"] Mar 07 21:42:29.321711 master-0 kubenswrapper[16352]: W0307 21:42:29.321612 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb995f4e5_10ab_47d2_939a_cb78465e3ea5.slice/crio-83ca4edb3ce70f3e9ce34555bf3fdae88d1eaf89203e4b9b8f06f93f3eb21f15 WatchSource:0}: Error finding container 83ca4edb3ce70f3e9ce34555bf3fdae88d1eaf89203e4b9b8f06f93f3eb21f15: Status 404 returned error can't find the container with id 83ca4edb3ce70f3e9ce34555bf3fdae88d1eaf89203e4b9b8f06f93f3eb21f15 Mar 07 21:42:30.069616 master-0 kubenswrapper[16352]: I0307 21:42:30.069527 16352 generic.go:334] "Generic (PLEG): container finished" podID="2a6736c6-a65f-4821-91a1-747418c62459" containerID="6cdac522b3b5a0c31b3bde57331a76e735d384d73d686550f0182c05c512a505" exitCode=0 Mar 07 21:42:30.070230 master-0 kubenswrapper[16352]: I0307 21:42:30.069654 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-mtvqh" event={"ID":"2a6736c6-a65f-4821-91a1-747418c62459","Type":"ContainerDied","Data":"6cdac522b3b5a0c31b3bde57331a76e735d384d73d686550f0182c05c512a505"} Mar 07 21:42:30.072646 master-0 kubenswrapper[16352]: I0307 21:42:30.072597 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"253c4b63-bf30-4275-9396-8a899301a4b9","Type":"ContainerStarted","Data":"bc1dd97c393dc696216c7075b626e8263955c4444da283b5f3eadc8540ec3744"} Mar 07 21:42:30.078700 master-0 kubenswrapper[16352]: I0307 21:42:30.075198 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fd8d8c7c7-w5vwh" event={"ID":"b995f4e5-10ab-47d2-939a-cb78465e3ea5","Type":"ContainerStarted","Data":"56b98b9586a48f63c22ce23c0c0d20c514ac2f5a4c9fed6e33c4b6b3b3e26a17"} Mar 07 21:42:30.078700 master-0 kubenswrapper[16352]: I0307 21:42:30.075260 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fd8d8c7c7-w5vwh" event={"ID":"b995f4e5-10ab-47d2-939a-cb78465e3ea5","Type":"ContainerStarted","Data":"60575e1abec726db64243529abe7900e05876569bbde7577c170b0f4566c9777"} Mar 07 21:42:30.078700 master-0 kubenswrapper[16352]: I0307 21:42:30.075273 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-fd8d8c7c7-w5vwh" event={"ID":"b995f4e5-10ab-47d2-939a-cb78465e3ea5","Type":"ContainerStarted","Data":"83ca4edb3ce70f3e9ce34555bf3fdae88d1eaf89203e4b9b8f06f93f3eb21f15"} Mar 07 21:42:30.078700 master-0 kubenswrapper[16352]: I0307 21:42:30.075384 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:30.126764 master-0 kubenswrapper[16352]: I0307 21:42:30.126644 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-fd8d8c7c7-w5vwh" podStartSLOduration=3.126618737 podStartE2EDuration="3.126618737s" 
podCreationTimestamp="2026-03-07 21:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:30.125850349 +0000 UTC m=+1473.196555398" watchObservedRunningTime="2026-03-07 21:42:30.126618737 +0000 UTC m=+1473.197323806" Mar 07 21:42:31.095853 master-0 kubenswrapper[16352]: I0307 21:42:31.095770 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-api-0" event={"ID":"253c4b63-bf30-4275-9396-8a899301a4b9","Type":"ContainerStarted","Data":"01675c82cbd09555bf4f122fabf14177d28c9b214ff7dcacc67846018f4a0807"} Mar 07 21:42:31.096659 master-0 kubenswrapper[16352]: I0307 21:42:31.096171 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-86971-api-0" Mar 07 21:42:31.133805 master-0 kubenswrapper[16352]: I0307 21:42:31.133634 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-api-0" podStartSLOduration=4.133246011 podStartE2EDuration="4.133246011s" podCreationTimestamp="2026-03-07 21:42:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:31.128921996 +0000 UTC m=+1474.199627065" watchObservedRunningTime="2026-03-07 21:42:31.133246011 +0000 UTC m=+1474.203951090" Mar 07 21:42:31.554508 master-0 kubenswrapper[16352]: I0307 21:42:31.554431 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:31.627837 master-0 kubenswrapper[16352]: I0307 21:42:31.627668 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-scheduler-0"] Mar 07 21:42:31.643260 master-0 kubenswrapper[16352]: I0307 21:42:31.643209 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-mtvqh" Mar 07 21:42:31.796802 master-0 kubenswrapper[16352]: I0307 21:42:31.796638 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-combined-ca-bundle\") pod \"2a6736c6-a65f-4821-91a1-747418c62459\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " Mar 07 21:42:31.796802 master-0 kubenswrapper[16352]: I0307 21:42:31.796742 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2nv9\" (UniqueName: \"kubernetes.io/projected/2a6736c6-a65f-4821-91a1-747418c62459-kube-api-access-b2nv9\") pod \"2a6736c6-a65f-4821-91a1-747418c62459\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " Mar 07 21:42:31.797091 master-0 kubenswrapper[16352]: I0307 21:42:31.796981 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-scripts\") pod \"2a6736c6-a65f-4821-91a1-747418c62459\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " Mar 07 21:42:31.798034 master-0 kubenswrapper[16352]: I0307 21:42:31.797164 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-config-data\") pod \"2a6736c6-a65f-4821-91a1-747418c62459\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " Mar 07 21:42:31.798034 master-0 kubenswrapper[16352]: I0307 21:42:31.797212 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a6736c6-a65f-4821-91a1-747418c62459-config-data-merged\") pod \"2a6736c6-a65f-4821-91a1-747418c62459\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " Mar 07 21:42:31.798034 master-0 kubenswrapper[16352]: I0307 21:42:31.797241 16352 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2a6736c6-a65f-4821-91a1-747418c62459-etc-podinfo\") pod \"2a6736c6-a65f-4821-91a1-747418c62459\" (UID: \"2a6736c6-a65f-4821-91a1-747418c62459\") " Mar 07 21:42:31.799702 master-0 kubenswrapper[16352]: I0307 21:42:31.799630 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2a6736c6-a65f-4821-91a1-747418c62459-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "2a6736c6-a65f-4821-91a1-747418c62459" (UID: "2a6736c6-a65f-4821-91a1-747418c62459"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:42:31.801741 master-0 kubenswrapper[16352]: I0307 21:42:31.801637 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2a6736c6-a65f-4821-91a1-747418c62459-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "2a6736c6-a65f-4821-91a1-747418c62459" (UID: "2a6736c6-a65f-4821-91a1-747418c62459"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 07 21:42:31.807451 master-0 kubenswrapper[16352]: I0307 21:42:31.807379 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-scripts" (OuterVolumeSpecName: "scripts") pod "2a6736c6-a65f-4821-91a1-747418c62459" (UID: "2a6736c6-a65f-4821-91a1-747418c62459"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:31.807451 master-0 kubenswrapper[16352]: I0307 21:42:31.807445 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2a6736c6-a65f-4821-91a1-747418c62459-kube-api-access-b2nv9" (OuterVolumeSpecName: "kube-api-access-b2nv9") pod "2a6736c6-a65f-4821-91a1-747418c62459" (UID: "2a6736c6-a65f-4821-91a1-747418c62459"). InnerVolumeSpecName "kube-api-access-b2nv9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:31.876238 master-0 kubenswrapper[16352]: I0307 21:42:31.876041 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-config-data" (OuterVolumeSpecName: "config-data") pod "2a6736c6-a65f-4821-91a1-747418c62459" (UID: "2a6736c6-a65f-4821-91a1-747418c62459"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:31.878532 master-0 kubenswrapper[16352]: I0307 21:42:31.878006 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:31.891060 master-0 kubenswrapper[16352]: I0307 21:42:31.890966 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2a6736c6-a65f-4821-91a1-747418c62459" (UID: "2a6736c6-a65f-4821-91a1-747418c62459"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:31.906776 master-0 kubenswrapper[16352]: I0307 21:42:31.906171 16352 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/2a6736c6-a65f-4821-91a1-747418c62459-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:31.906776 master-0 kubenswrapper[16352]: I0307 21:42:31.906244 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:31.906776 master-0 kubenswrapper[16352]: I0307 21:42:31.906257 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2nv9\" (UniqueName: \"kubernetes.io/projected/2a6736c6-a65f-4821-91a1-747418c62459-kube-api-access-b2nv9\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:31.906776 master-0 kubenswrapper[16352]: I0307 21:42:31.906269 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:31.906776 master-0 kubenswrapper[16352]: I0307 21:42:31.906277 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2a6736c6-a65f-4821-91a1-747418c62459-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:31.906776 master-0 kubenswrapper[16352]: I0307 21:42:31.906287 16352 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/2a6736c6-a65f-4821-91a1-747418c62459-config-data-merged\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:31.977709 master-0 kubenswrapper[16352]: I0307 21:42:31.972156 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:32.120808 master-0 
kubenswrapper[16352]: I0307 21:42:32.120656 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-db-sync-mtvqh" event={"ID":"2a6736c6-a65f-4821-91a1-747418c62459","Type":"ContainerDied","Data":"d400d729dcdfdb910c4362701c566ac62af786a6b20da71b6e9400fe70e62280"} Mar 07 21:42:32.120808 master-0 kubenswrapper[16352]: I0307 21:42:32.120733 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d400d729dcdfdb910c4362701c566ac62af786a6b20da71b6e9400fe70e62280" Mar 07 21:42:32.121551 master-0 kubenswrapper[16352]: I0307 21:42:32.120859 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-scheduler-0" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="cinder-scheduler" containerID="cri-o://07e1463666ed4bc8941e56a2fb227bab60740dff683bc71760cfd12a657ad0ab" gracePeriod=30 Mar 07 21:42:32.121551 master-0 kubenswrapper[16352]: I0307 21:42:32.121495 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-db-sync-mtvqh" Mar 07 21:42:32.123201 master-0 kubenswrapper[16352]: I0307 21:42:32.123136 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-volume-lvm-iscsi-0" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="cinder-volume" containerID="cri-o://0c3281a5a62aa7167b934e02e0ce4b52dd36eac632736db435e2f8e4a252d234" gracePeriod=30 Mar 07 21:42:32.128021 master-0 kubenswrapper[16352]: I0307 21:42:32.126844 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-scheduler-0" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="probe" containerID="cri-o://df24f3966ee9c06e2387b5a4896a043900b48426a93d9c0a9c63c6527a518b8b" gracePeriod=30 Mar 07 21:42:32.128021 master-0 kubenswrapper[16352]: I0307 21:42:32.123391 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-volume-lvm-iscsi-0" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="probe" containerID="cri-o://be583d403489b6e6d7035f5a522e0cb98d638b8ae91cbbd533cf866161a60b68" gracePeriod=30 Mar 07 21:42:32.160610 master-0 kubenswrapper[16352]: I0307 21:42:32.160534 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-86971-backup-0" Mar 07 21:42:32.282726 master-0 kubenswrapper[16352]: I0307 21:42:32.282593 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-backup-0"] Mar 07 21:42:32.616264 master-0 kubenswrapper[16352]: I0307 21:42:32.609389 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-create-sdzv8"] Mar 07 21:42:32.616264 master-0 kubenswrapper[16352]: E0307 21:42:32.610020 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6736c6-a65f-4821-91a1-747418c62459" containerName="init" Mar 07 21:42:32.616264 master-0 kubenswrapper[16352]: I0307 21:42:32.610037 
16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6736c6-a65f-4821-91a1-747418c62459" containerName="init" Mar 07 21:42:32.616264 master-0 kubenswrapper[16352]: E0307 21:42:32.610059 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2a6736c6-a65f-4821-91a1-747418c62459" containerName="ironic-db-sync" Mar 07 21:42:32.616264 master-0 kubenswrapper[16352]: I0307 21:42:32.610066 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2a6736c6-a65f-4821-91a1-747418c62459" containerName="ironic-db-sync" Mar 07 21:42:32.616264 master-0 kubenswrapper[16352]: I0307 21:42:32.610303 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2a6736c6-a65f-4821-91a1-747418c62459" containerName="ironic-db-sync" Mar 07 21:42:32.616264 master-0 kubenswrapper[16352]: I0307 21:42:32.611164 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:32.736981 master-0 kubenswrapper[16352]: I0307 21:42:32.736870 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-sdzv8"] Mar 07 21:42:32.767486 master-0 kubenswrapper[16352]: I0307 21:42:32.763619 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4rxz\" (UniqueName: \"kubernetes.io/projected/ac2528cb-3b05-45ee-adf7-67e32faaab12-kube-api-access-x4rxz\") pod \"ironic-inspector-db-create-sdzv8\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:32.767486 master-0 kubenswrapper[16352]: I0307 21:42:32.763798 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2528cb-3b05-45ee-adf7-67e32faaab12-operator-scripts\") pod \"ironic-inspector-db-create-sdzv8\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " 
pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:32.875253 master-0 kubenswrapper[16352]: I0307 21:42:32.875185 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2528cb-3b05-45ee-adf7-67e32faaab12-operator-scripts\") pod \"ironic-inspector-db-create-sdzv8\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:32.875500 master-0 kubenswrapper[16352]: I0307 21:42:32.875461 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4rxz\" (UniqueName: \"kubernetes.io/projected/ac2528cb-3b05-45ee-adf7-67e32faaab12-kube-api-access-x4rxz\") pod \"ironic-inspector-db-create-sdzv8\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:32.877142 master-0 kubenswrapper[16352]: I0307 21:42:32.877098 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2528cb-3b05-45ee-adf7-67e32faaab12-operator-scripts\") pod \"ironic-inspector-db-create-sdzv8\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:32.879471 master-0 kubenswrapper[16352]: I0307 21:42:32.879016 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-neutron-agent-89874fdc8-kjtzj"] Mar 07 21:42:32.880846 master-0 kubenswrapper[16352]: I0307 21:42:32.880810 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:42:32.891120 master-0 kubenswrapper[16352]: I0307 21:42:32.889468 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-ironic-neutron-agent-config-data" Mar 07 21:42:32.901643 master-0 kubenswrapper[16352]: I0307 21:42:32.901011 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-89874fdc8-kjtzj"] Mar 07 21:42:32.940533 master-0 kubenswrapper[16352]: I0307 21:42:32.940285 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fb78888f7-pwtc8"] Mar 07 21:42:32.940836 master-0 kubenswrapper[16352]: I0307 21:42:32.940672 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" podUID="c6ec644f-79e0-428a-a260-c2dde4320020" containerName="dnsmasq-dns" containerID="cri-o://dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63" gracePeriod=10 Mar 07 21:42:32.960489 master-0 kubenswrapper[16352]: I0307 21:42:32.956913 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-d904-account-create-update-tc485"] Mar 07 21:42:32.979068 master-0 kubenswrapper[16352]: I0307 21:42:32.978911 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-d904-account-create-update-tc485"] Mar 07 21:42:32.979068 master-0 kubenswrapper[16352]: I0307 21:42:32.979035 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" Mar 07 21:42:32.986073 master-0 kubenswrapper[16352]: I0307 21:42:32.985984 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-d904-account-create-update-tc485" Mar 07 21:42:32.995171 master-0 kubenswrapper[16352]: I0307 21:42:32.995105 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-db-secret" Mar 07 21:42:33.013786 master-0 kubenswrapper[16352]: I0307 21:42:33.001732 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4rxz\" (UniqueName: \"kubernetes.io/projected/ac2528cb-3b05-45ee-adf7-67e32faaab12-kube-api-access-x4rxz\") pod \"ironic-inspector-db-create-sdzv8\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:33.019215 master-0 kubenswrapper[16352]: I0307 21:42:33.018203 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-699fc4cfdf-cmxnl"] Mar 07 21:42:33.022752 master-0 kubenswrapper[16352]: I0307 21:42:33.021351 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" Mar 07 21:42:33.080886 master-0 kubenswrapper[16352]: I0307 21:42:33.078447 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699fc4cfdf-cmxnl"] Mar 07 21:42:33.080886 master-0 kubenswrapper[16352]: I0307 21:42:33.080340 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b7e31a-1da5-4528-b904-db7de86e1f26-combined-ca-bundle\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:42:33.080886 master-0 kubenswrapper[16352]: I0307 21:42:33.080435 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55b7e31a-1da5-4528-b904-db7de86e1f26-config\") pod 
\"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:42:33.080886 master-0 kubenswrapper[16352]: I0307 21:42:33.080482 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwhf2\" (UniqueName: \"kubernetes.io/projected/77f0edd2-211c-423c-b49f-d2c69df20f23-kube-api-access-xwhf2\") pod \"ironic-inspector-d904-account-create-update-tc485\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " pod="openstack/ironic-inspector-d904-account-create-update-tc485" Mar 07 21:42:33.080886 master-0 kubenswrapper[16352]: I0307 21:42:33.080518 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8pth\" (UniqueName: \"kubernetes.io/projected/55b7e31a-1da5-4528-b904-db7de86e1f26-kube-api-access-j8pth\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:42:33.080886 master-0 kubenswrapper[16352]: I0307 21:42:33.080590 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77f0edd2-211c-423c-b49f-d2c69df20f23-operator-scripts\") pod \"ironic-inspector-d904-account-create-update-tc485\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " pod="openstack/ironic-inspector-d904-account-create-update-tc485" Mar 07 21:42:33.151063 master-0 kubenswrapper[16352]: I0307 21:42:33.145529 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-f97759bbc-nbv8w"] Mar 07 21:42:33.164876 master-0 kubenswrapper[16352]: I0307 21:42:33.164090 16352 generic.go:334] "Generic (PLEG): container finished" podID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerID="0c3281a5a62aa7167b934e02e0ce4b52dd36eac632736db435e2f8e4a252d234" exitCode=0 Mar 07 
21:42:33.164876 master-0 kubenswrapper[16352]: I0307 21:42:33.164377 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-backup-0" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="cinder-backup" containerID="cri-o://b822ddef479a10d11d2bfbac6c105b64ecc3912b656c9b8d5c04d08bcf7ebb01" gracePeriod=30
Mar 07 21:42:33.164876 master-0 kubenswrapper[16352]: I0307 21:42:33.164663 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-86971-backup-0" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="probe" containerID="cri-o://161ae14484b48f26c24528d92a92ab3ea4cd07507499acb11ec2f9706bc4d932" gracePeriod=30
Mar 07 21:42:33.172311 master-0 kubenswrapper[16352]: I0307 21:42:33.167043 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"c3666942-0515-42ad-aa2a-20be90d7bc83","Type":"ContainerDied","Data":"0c3281a5a62aa7167b934e02e0ce4b52dd36eac632736db435e2f8e4a252d234"}
Mar 07 21:42:33.172311 master-0 kubenswrapper[16352]: I0307 21:42:33.167852 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.172311 master-0 kubenswrapper[16352]: I0307 21:42:33.171793 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-scripts"
Mar 07 21:42:33.172311 master-0 kubenswrapper[16352]: I0307 21:42:33.172019 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret"
Mar 07 21:42:33.172311 master-0 kubenswrapper[16352]: I0307 21:42:33.172161 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-api-config-data"
Mar 07 21:42:33.172311 master-0 kubenswrapper[16352]: I0307 21:42:33.172270 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-transport"
Mar 07 21:42:33.172810 master-0 kubenswrapper[16352]: I0307 21:42:33.172388 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-config-data"
Mar 07 21:42:33.186877 master-0 kubenswrapper[16352]: I0307 21:42:33.186800 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-nb\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.187228 master-0 kubenswrapper[16352]: I0307 21:42:33.187162 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-sb\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.187228 master-0 kubenswrapper[16352]: I0307 21:42:33.187221 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-swift-storage-0\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.187341 master-0 kubenswrapper[16352]: I0307 21:42:33.187251 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b7e31a-1da5-4528-b904-db7de86e1f26-combined-ca-bundle\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:33.187341 master-0 kubenswrapper[16352]: I0307 21:42:33.187301 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/55b7e31a-1da5-4528-b904-db7de86e1f26-config\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:33.187341 master-0 kubenswrapper[16352]: I0307 21:42:33.187321 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-svc\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.187446 master-0 kubenswrapper[16352]: I0307 21:42:33.187352 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwhf2\" (UniqueName: \"kubernetes.io/projected/77f0edd2-211c-423c-b49f-d2c69df20f23-kube-api-access-xwhf2\") pod \"ironic-inspector-d904-account-create-update-tc485\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " pod="openstack/ironic-inspector-d904-account-create-update-tc485"
Mar 07 21:42:33.187446 master-0 kubenswrapper[16352]: I0307 21:42:33.187418 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j8pth\" (UniqueName: \"kubernetes.io/projected/55b7e31a-1da5-4528-b904-db7de86e1f26-kube-api-access-j8pth\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:33.187506 master-0 kubenswrapper[16352]: I0307 21:42:33.187448 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77f0edd2-211c-423c-b49f-d2c69df20f23-operator-scripts\") pod \"ironic-inspector-d904-account-create-update-tc485\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " pod="openstack/ironic-inspector-d904-account-create-update-tc485"
Mar 07 21:42:33.187771 master-0 kubenswrapper[16352]: I0307 21:42:33.187541 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-config\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.187771 master-0 kubenswrapper[16352]: I0307 21:42:33.187601 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj4x6\" (UniqueName: \"kubernetes.io/projected/e33d7a05-baac-460b-9f72-133d1f7c7b07-kube-api-access-nj4x6\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.190234 master-0 kubenswrapper[16352]: I0307 21:42:33.190130 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77f0edd2-211c-423c-b49f-d2c69df20f23-operator-scripts\") pod \"ironic-inspector-d904-account-create-update-tc485\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " pod="openstack/ironic-inspector-d904-account-create-update-tc485"
Mar 07 21:42:33.205339 master-0 kubenswrapper[16352]: I0307 21:42:33.204611 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/55b7e31a-1da5-4528-b904-db7de86e1f26-combined-ca-bundle\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:33.234748 master-0 kubenswrapper[16352]: I0307 21:42:33.223101 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwhf2\" (UniqueName: \"kubernetes.io/projected/77f0edd2-211c-423c-b49f-d2c69df20f23-kube-api-access-xwhf2\") pod \"ironic-inspector-d904-account-create-update-tc485\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " pod="openstack/ironic-inspector-d904-account-create-update-tc485"
Mar 07 21:42:33.234748 master-0 kubenswrapper[16352]: I0307 21:42:33.226416 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8pth\" (UniqueName: \"kubernetes.io/projected/55b7e31a-1da5-4528-b904-db7de86e1f26-kube-api-access-j8pth\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:33.240053 master-0 kubenswrapper[16352]: I0307 21:42:33.239987 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-sdzv8"
Mar 07 21:42:33.240158 master-0 kubenswrapper[16352]: I0307 21:42:33.240069 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/55b7e31a-1da5-4528-b904-db7de86e1f26-config\") pod \"ironic-neutron-agent-89874fdc8-kjtzj\" (UID: \"55b7e31a-1da5-4528-b904-db7de86e1f26\") " pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:33.256194 master-0 kubenswrapper[16352]: I0307 21:42:33.253770 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-f97759bbc-nbv8w"]
Mar 07 21:42:33.297182 master-0 kubenswrapper[16352]: I0307 21:42:33.297111 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-custom\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.298600 master-0 kubenswrapper[16352]: I0307 21:42:33.298579 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-config\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.298789 master-0 kubenswrapper[16352]: I0307 21:42:33.298773 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nj4x6\" (UniqueName: \"kubernetes.io/projected/e33d7a05-baac-460b-9f72-133d1f7c7b07-kube-api-access-nj4x6\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.298940 master-0 kubenswrapper[16352]: I0307 21:42:33.298923 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8ffg\" (UniqueName: \"kubernetes.io/projected/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-kube-api-access-m8ffg\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.299051 master-0 kubenswrapper[16352]: I0307 21:42:33.299033 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-scripts\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.299193 master-0 kubenswrapper[16352]: I0307 21:42:33.299170 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-nb\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.299319 master-0 kubenswrapper[16352]: I0307 21:42:33.299304 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-combined-ca-bundle\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.299431 master-0 kubenswrapper[16352]: I0307 21:42:33.299414 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-logs\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.299528 master-0 kubenswrapper[16352]: I0307 21:42:33.299513 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.299663 master-0 kubenswrapper[16352]: I0307 21:42:33.299649 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-sb\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.299925 master-0 kubenswrapper[16352]: I0307 21:42:33.299876 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-swift-storage-0\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.300029 master-0 kubenswrapper[16352]: I0307 21:42:33.300012 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-etc-podinfo\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.300146 master-0 kubenswrapper[16352]: I0307 21:42:33.300130 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-merged\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.300299 master-0 kubenswrapper[16352]: I0307 21:42:33.300284 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-svc\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.300723 master-0 kubenswrapper[16352]: I0307 21:42:33.300668 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-config\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.301859 master-0 kubenswrapper[16352]: I0307 21:42:33.301827 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-swift-storage-0\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.302520 master-0 kubenswrapper[16352]: I0307 21:42:33.302480 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-nb\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.303026 master-0 kubenswrapper[16352]: I0307 21:42:33.302961 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-sb\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.307128 master-0 kubenswrapper[16352]: I0307 21:42:33.307090 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-svc\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.332314 master-0 kubenswrapper[16352]: I0307 21:42:33.332007 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj4x6\" (UniqueName: \"kubernetes.io/projected/e33d7a05-baac-460b-9f72-133d1f7c7b07-kube-api-access-nj4x6\") pod \"dnsmasq-dns-699fc4cfdf-cmxnl\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") " pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.403144 master-0 kubenswrapper[16352]: I0307 21:42:33.403075 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-etc-podinfo\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.403322 master-0 kubenswrapper[16352]: I0307 21:42:33.403161 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-merged\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.403322 master-0 kubenswrapper[16352]: I0307 21:42:33.403263 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-custom\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.403322 master-0 kubenswrapper[16352]: I0307 21:42:33.403316 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8ffg\" (UniqueName: \"kubernetes.io/projected/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-kube-api-access-m8ffg\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.404849 master-0 kubenswrapper[16352]: I0307 21:42:33.403344 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-scripts\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.404849 master-0 kubenswrapper[16352]: I0307 21:42:33.403982 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-combined-ca-bundle\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.404849 master-0 kubenswrapper[16352]: I0307 21:42:33.404046 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-logs\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.404849 master-0 kubenswrapper[16352]: I0307 21:42:33.404080 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.404849 master-0 kubenswrapper[16352]: I0307 21:42:33.404271 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-merged\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.404849 master-0 kubenswrapper[16352]: I0307 21:42:33.404518 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-logs\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.408630 master-0 kubenswrapper[16352]: I0307 21:42:33.408538 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-custom\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.408741 master-0 kubenswrapper[16352]: I0307 21:42:33.408703 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.409086 master-0 kubenswrapper[16352]: I0307 21:42:33.409052 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-etc-podinfo\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.409216 master-0 kubenswrapper[16352]: I0307 21:42:33.409169 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-scripts\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.415332 master-0 kubenswrapper[16352]: I0307 21:42:33.411120 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-combined-ca-bundle\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.430164 master-0 kubenswrapper[16352]: I0307 21:42:33.430102 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8ffg\" (UniqueName: \"kubernetes.io/projected/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-kube-api-access-m8ffg\") pod \"ironic-f97759bbc-nbv8w\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") " pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.476959 master-0 kubenswrapper[16352]: I0307 21:42:33.476902 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-d904-account-create-update-tc485"
Mar 07 21:42:33.481785 master-0 kubenswrapper[16352]: I0307 21:42:33.481732 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:42:33.507485 master-0 kubenswrapper[16352]: I0307 21:42:33.507338 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:33.662829 master-0 kubenswrapper[16352]: I0307 21:42:33.662262 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:33.878605 master-0 kubenswrapper[16352]: I0307 21:42:33.877817 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:34.027984 master-0 kubenswrapper[16352]: I0307 21:42:34.027871 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-swift-storage-0\") pod \"c6ec644f-79e0-428a-a260-c2dde4320020\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") "
Mar 07 21:42:34.028300 master-0 kubenswrapper[16352]: I0307 21:42:34.028073 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-config\") pod \"c6ec644f-79e0-428a-a260-c2dde4320020\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") "
Mar 07 21:42:34.028300 master-0 kubenswrapper[16352]: I0307 21:42:34.028149 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-sb\") pod \"c6ec644f-79e0-428a-a260-c2dde4320020\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") "
Mar 07 21:42:34.028454 master-0 kubenswrapper[16352]: I0307 21:42:34.028423 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-svc\") pod \"c6ec644f-79e0-428a-a260-c2dde4320020\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") "
Mar 07 21:42:34.036942 master-0 kubenswrapper[16352]: I0307 21:42:34.029588 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-nb\") pod \"c6ec644f-79e0-428a-a260-c2dde4320020\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") "
Mar 07 21:42:34.036942 master-0 kubenswrapper[16352]: I0307 21:42:34.030577 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b78v6\" (UniqueName: \"kubernetes.io/projected/c6ec644f-79e0-428a-a260-c2dde4320020-kube-api-access-b78v6\") pod \"c6ec644f-79e0-428a-a260-c2dde4320020\" (UID: \"c6ec644f-79e0-428a-a260-c2dde4320020\") "
Mar 07 21:42:34.076880 master-0 kubenswrapper[16352]: I0307 21:42:34.070049 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c6ec644f-79e0-428a-a260-c2dde4320020-kube-api-access-b78v6" (OuterVolumeSpecName: "kube-api-access-b78v6") pod "c6ec644f-79e0-428a-a260-c2dde4320020" (UID: "c6ec644f-79e0-428a-a260-c2dde4320020"). InnerVolumeSpecName "kube-api-access-b78v6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:42:34.138482 master-0 kubenswrapper[16352]: I0307 21:42:34.138397 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b78v6\" (UniqueName: \"kubernetes.io/projected/c6ec644f-79e0-428a-a260-c2dde4320020-kube-api-access-b78v6\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:34.186178 master-0 kubenswrapper[16352]: I0307 21:42:34.186116 16352 generic.go:334] "Generic (PLEG): container finished" podID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerID="df24f3966ee9c06e2387b5a4896a043900b48426a93d9c0a9c63c6527a518b8b" exitCode=0
Mar 07 21:42:34.186796 master-0 kubenswrapper[16352]: I0307 21:42:34.186198 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"156d22b1-6d30-42b4-8cff-e5a563d2861d","Type":"ContainerDied","Data":"df24f3966ee9c06e2387b5a4896a043900b48426a93d9c0a9c63c6527a518b8b"}
Mar 07 21:42:34.191643 master-0 kubenswrapper[16352]: I0307 21:42:34.191582 16352 generic.go:334] "Generic (PLEG): container finished" podID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerID="be583d403489b6e6d7035f5a522e0cb98d638b8ae91cbbd533cf866161a60b68" exitCode=0
Mar 07 21:42:34.191786 master-0 kubenswrapper[16352]: I0307 21:42:34.191662 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"c3666942-0515-42ad-aa2a-20be90d7bc83","Type":"ContainerDied","Data":"be583d403489b6e6d7035f5a522e0cb98d638b8ae91cbbd533cf866161a60b68"}
Mar 07 21:42:34.191786 master-0 kubenswrapper[16352]: I0307 21:42:34.191709 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"c3666942-0515-42ad-aa2a-20be90d7bc83","Type":"ContainerDied","Data":"758dddbb0f308bd2811b7f22ddbd1a38eb6470e67a6f7064833e11147e57e00a"}
Mar 07 21:42:34.191786 master-0 kubenswrapper[16352]: I0307 21:42:34.191726 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758dddbb0f308bd2811b7f22ddbd1a38eb6470e67a6f7064833e11147e57e00a"
Mar 07 21:42:34.193695 master-0 kubenswrapper[16352]: I0307 21:42:34.193587 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c6ec644f-79e0-428a-a260-c2dde4320020" (UID: "c6ec644f-79e0-428a-a260-c2dde4320020"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:34.195253 master-0 kubenswrapper[16352]: I0307 21:42:34.195183 16352 generic.go:334] "Generic (PLEG): container finished" podID="c6ec644f-79e0-428a-a260-c2dde4320020" containerID="dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63" exitCode=0
Mar 07 21:42:34.195309 master-0 kubenswrapper[16352]: I0307 21:42:34.195257 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" event={"ID":"c6ec644f-79e0-428a-a260-c2dde4320020","Type":"ContainerDied","Data":"dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63"}
Mar 07 21:42:34.195350 master-0 kubenswrapper[16352]: I0307 21:42:34.195306 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8" event={"ID":"c6ec644f-79e0-428a-a260-c2dde4320020","Type":"ContainerDied","Data":"61cac17473ccbb3e78d95ba13f2f4f687e3bd74dce5b3a8a49fe3f1c291ce705"}
Mar 07 21:42:34.195350 master-0 kubenswrapper[16352]: I0307 21:42:34.195328 16352 scope.go:117] "RemoveContainer" containerID="dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63"
Mar 07 21:42:34.195577 master-0 kubenswrapper[16352]: I0307 21:42:34.195541 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7fb78888f7-pwtc8"
Mar 07 21:42:34.223669 master-0 kubenswrapper[16352]: I0307 21:42:34.223550 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c6ec644f-79e0-428a-a260-c2dde4320020" (UID: "c6ec644f-79e0-428a-a260-c2dde4320020"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:34.230555 master-0 kubenswrapper[16352]: I0307 21:42:34.230343 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c6ec644f-79e0-428a-a260-c2dde4320020" (UID: "c6ec644f-79e0-428a-a260-c2dde4320020"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:34.255327 master-0 kubenswrapper[16352]: I0307 21:42:34.255222 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:34.255327 master-0 kubenswrapper[16352]: I0307 21:42:34.255323 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:34.255327 master-0 kubenswrapper[16352]: I0307 21:42:34.255337 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:34.270497 master-0 kubenswrapper[16352]: I0307 21:42:34.270432 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-volume-lvm-iscsi-0"
Mar 07 21:42:34.275021 master-0 kubenswrapper[16352]: I0307 21:42:34.274955 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-config" (OuterVolumeSpecName: "config") pod "c6ec644f-79e0-428a-a260-c2dde4320020" (UID: "c6ec644f-79e0-428a-a260-c2dde4320020"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:42:34.378449 master-0 kubenswrapper[16352]: I0307 21:42:34.378268 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data-custom\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.378449 master-0 kubenswrapper[16352]: I0307 21:42:34.378385 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-combined-ca-bundle\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.378795 master-0 kubenswrapper[16352]: I0307 21:42:34.378720 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-iscsi\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.378970 master-0 kubenswrapper[16352]: I0307 21:42:34.378935 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-scripts\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379013 master-0 kubenswrapper[16352]: I0307 21:42:34.378998 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-nvme\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379047 master-0 kubenswrapper[16352]: I0307 21:42:34.379027 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379086 master-0 kubenswrapper[16352]: I0307 21:42:34.379071 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-run\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379146 master-0 kubenswrapper[16352]: I0307 21:42:34.379126 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2h5g2\" (UniqueName: \"kubernetes.io/projected/c3666942-0515-42ad-aa2a-20be90d7bc83-kube-api-access-2h5g2\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379193 master-0 kubenswrapper[16352]: I0307 21:42:34.379158 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-lib-modules\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379263 master-0 kubenswrapper[16352]: I0307 21:42:34.379235 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-cinder\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379378 master-0 kubenswrapper[16352]: I0307 21:42:34.379350 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-machine-id\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379416 master-0 kubenswrapper[16352]: I0307 21:42:34.379383 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-lib-cinder\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379484 master-0 kubenswrapper[16352]: I0307 21:42:34.379464 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-brick\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379525 master-0 kubenswrapper[16352]: I0307 21:42:34.379495 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-dev\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.379573 master-0 kubenswrapper[16352]: I0307 21:42:34.379554 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-sys\") pod \"c3666942-0515-42ad-aa2a-20be90d7bc83\" (UID: \"c3666942-0515-42ad-aa2a-20be90d7bc83\") "
Mar 07 21:42:34.381574 master-0 kubenswrapper[16352]: I0307 21:42:34.381527 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.383331 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.386338 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.386436 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.386845 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-scripts" (OuterVolumeSpecName: "scripts") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.386893 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). 
InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.386928 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.386951 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.386982 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-dev" (OuterVolumeSpecName: "dev") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.387362 master-0 kubenswrapper[16352]: I0307 21:42:34.387004 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-sys" (OuterVolumeSpecName: "sys") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.388414 master-0 kubenswrapper[16352]: I0307 21:42:34.388328 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.388484 master-0 kubenswrapper[16352]: I0307 21:42:34.388449 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-run" (OuterVolumeSpecName: "run") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:34.399298 master-0 kubenswrapper[16352]: I0307 21:42:34.398484 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3666942-0515-42ad-aa2a-20be90d7bc83-kube-api-access-2h5g2" (OuterVolumeSpecName: "kube-api-access-2h5g2") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "kube-api-access-2h5g2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:34.400514 master-0 kubenswrapper[16352]: I0307 21:42:34.400411 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "config-data-custom". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:34.455820 master-0 kubenswrapper[16352]: I0307 21:42:34.455561 16352 scope.go:117] "RemoveContainer" containerID="a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524001 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524068 16352 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-nvme\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524095 16352 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-run\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524108 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2h5g2\" (UniqueName: \"kubernetes.io/projected/c3666942-0515-42ad-aa2a-20be90d7bc83-kube-api-access-2h5g2\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524120 16352 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-lib-modules\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524132 16352 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-cinder\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 
21:42:34.524145 16352 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524155 16352 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-lib-cinder\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524165 16352 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-var-locks-brick\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524174 16352 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-dev\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524185 16352 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-sys\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524197 16352 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.524210 16352 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3666942-0515-42ad-aa2a-20be90d7bc83-etc-iscsi\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.533840 master-0 kubenswrapper[16352]: I0307 21:42:34.530163 16352 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c6ec644f-79e0-428a-a260-c2dde4320020" (UID: "c6ec644f-79e0-428a-a260-c2dde4320020"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:34.559214 master-0 kubenswrapper[16352]: I0307 21:42:34.554035 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:34.611893 master-0 kubenswrapper[16352]: I0307 21:42:34.609992 16352 scope.go:117] "RemoveContainer" containerID="dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63" Mar 07 21:42:34.611893 master-0 kubenswrapper[16352]: E0307 21:42:34.611432 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63\": container with ID starting with dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63 not found: ID does not exist" containerID="dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63" Mar 07 21:42:34.611893 master-0 kubenswrapper[16352]: I0307 21:42:34.611476 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63"} err="failed to get container status \"dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63\": rpc error: code = NotFound desc = could not find container \"dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63\": container 
with ID starting with dd888ebb3ed4e69d0e5c5986a9da8f100f71f960a2e7c40734c860d0c14eab63 not found: ID does not exist" Mar 07 21:42:34.611893 master-0 kubenswrapper[16352]: I0307 21:42:34.611505 16352 scope.go:117] "RemoveContainer" containerID="a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55" Mar 07 21:42:34.620170 master-0 kubenswrapper[16352]: E0307 21:42:34.620095 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55\": container with ID starting with a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55 not found: ID does not exist" containerID="a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55" Mar 07 21:42:34.620170 master-0 kubenswrapper[16352]: I0307 21:42:34.620149 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55"} err="failed to get container status \"a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55\": rpc error: code = NotFound desc = could not find container \"a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55\": container with ID starting with a29cbd0108126c52f100b05f2229ed68a8aeaad1fe6b57b7e52e020eacc5ed55 not found: ID does not exist" Mar 07 21:42:34.636871 master-0 kubenswrapper[16352]: I0307 21:42:34.636426 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.637166 master-0 kubenswrapper[16352]: I0307 21:42:34.637149 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c6ec644f-79e0-428a-a260-c2dde4320020-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.648640 
master-0 kubenswrapper[16352]: I0307 21:42:34.648581 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-create-sdzv8"] Mar 07 21:42:34.696961 master-0 kubenswrapper[16352]: I0307 21:42:34.696838 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data" (OuterVolumeSpecName: "config-data") pod "c3666942-0515-42ad-aa2a-20be90d7bc83" (UID: "c3666942-0515-42ad-aa2a-20be90d7bc83"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:34.780476 master-0 kubenswrapper[16352]: I0307 21:42:34.780396 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3666942-0515-42ad-aa2a-20be90d7bc83-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:34.783985 master-0 kubenswrapper[16352]: I0307 21:42:34.782681 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-699fc4cfdf-cmxnl"] Mar 07 21:42:34.843146 master-0 kubenswrapper[16352]: I0307 21:42:34.822606 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-neutron-agent-89874fdc8-kjtzj"] Mar 07 21:42:34.870409 master-0 kubenswrapper[16352]: I0307 21:42:34.870347 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-conductor-0"] Mar 07 21:42:34.870981 master-0 kubenswrapper[16352]: E0307 21:42:34.870956 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="cinder-volume" Mar 07 21:42:34.870981 master-0 kubenswrapper[16352]: I0307 21:42:34.870978 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="cinder-volume" Mar 07 21:42:34.871059 master-0 kubenswrapper[16352]: E0307 21:42:34.870997 16352 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="probe" Mar 07 21:42:34.871059 master-0 kubenswrapper[16352]: I0307 21:42:34.871005 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="probe" Mar 07 21:42:34.871059 master-0 kubenswrapper[16352]: E0307 21:42:34.871055 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ec644f-79e0-428a-a260-c2dde4320020" containerName="init" Mar 07 21:42:34.871154 master-0 kubenswrapper[16352]: I0307 21:42:34.871062 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ec644f-79e0-428a-a260-c2dde4320020" containerName="init" Mar 07 21:42:34.871154 master-0 kubenswrapper[16352]: E0307 21:42:34.871077 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c6ec644f-79e0-428a-a260-c2dde4320020" containerName="dnsmasq-dns" Mar 07 21:42:34.871154 master-0 kubenswrapper[16352]: I0307 21:42:34.871083 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c6ec644f-79e0-428a-a260-c2dde4320020" containerName="dnsmasq-dns" Mar 07 21:42:34.871347 master-0 kubenswrapper[16352]: I0307 21:42:34.871322 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="probe" Mar 07 21:42:34.871413 master-0 kubenswrapper[16352]: I0307 21:42:34.871369 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c6ec644f-79e0-428a-a260-c2dde4320020" containerName="dnsmasq-dns" Mar 07 21:42:34.871413 master-0 kubenswrapper[16352]: I0307 21:42:34.871392 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" containerName="cinder-volume" Mar 07 21:42:34.878414 master-0 kubenswrapper[16352]: I0307 21:42:34.875591 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-conductor-0" Mar 07 21:42:34.880250 master-0 kubenswrapper[16352]: I0307 21:42:34.879979 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-scripts" Mar 07 21:42:34.880250 master-0 kubenswrapper[16352]: I0307 21:42:34.880162 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-conductor-config-data" Mar 07 21:42:34.893638 master-0 kubenswrapper[16352]: I0307 21:42:34.893518 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 07 21:42:34.917158 master-0 kubenswrapper[16352]: I0307 21:42:34.910372 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7fb78888f7-pwtc8"] Mar 07 21:42:34.930232 master-0 kubenswrapper[16352]: I0307 21:42:34.929216 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7fb78888f7-pwtc8"] Mar 07 21:42:34.988517 master-0 kubenswrapper[16352]: I0307 21:42:34.988378 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:34.989094 master-0 kubenswrapper[16352]: I0307 21:42:34.988525 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-f8d69209-b4b7-4f2f-ae57-1537b9cc303f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a9636256-c038-462a-a0bf-2f9bbedc45c3\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:34.989094 master-0 kubenswrapper[16352]: I0307 21:42:34.988688 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:34.989094 master-0 kubenswrapper[16352]: I0307 21:42:34.988849 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-scripts\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:34.989094 master-0 kubenswrapper[16352]: I0307 21:42:34.988881 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpvhh\" (UniqueName: \"kubernetes.io/projected/121505c3-5091-4945-a0aa-ec97b5f45ce5-kube-api-access-jpvhh\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:34.989094 master-0 kubenswrapper[16352]: I0307 21:42:34.988999 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/121505c3-5091-4945-a0aa-ec97b5f45ce5-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:34.989094 master-0 kubenswrapper[16352]: I0307 21:42:34.989035 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:34.989772 master-0 kubenswrapper[16352]: I0307 21:42:34.989688 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.092452 master-0 kubenswrapper[16352]: I0307 21:42:35.092349 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/121505c3-5091-4945-a0aa-ec97b5f45ce5-etc-podinfo\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.092733 master-0 kubenswrapper[16352]: I0307 21:42:35.092475 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.092733 master-0 kubenswrapper[16352]: I0307 21:42:35.092545 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.092733 master-0 kubenswrapper[16352]: I0307 21:42:35.092627 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.092733 master-0 kubenswrapper[16352]: I0307 21:42:35.092734 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-f8d69209-b4b7-4f2f-ae57-1537b9cc303f\" (UniqueName: 
\"kubernetes.io/csi/topolvm.io^a9636256-c038-462a-a0bf-2f9bbedc45c3\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.092929 master-0 kubenswrapper[16352]: I0307 21:42:35.092869 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.092929 master-0 kubenswrapper[16352]: I0307 21:42:35.092909 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-scripts\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.093011 master-0 kubenswrapper[16352]: I0307 21:42:35.092939 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpvhh\" (UniqueName: \"kubernetes.io/projected/121505c3-5091-4945-a0aa-ec97b5f45ce5-kube-api-access-jpvhh\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.106602 master-0 kubenswrapper[16352]: I0307 21:42:35.093063 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data-merged\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.106602 master-0 kubenswrapper[16352]: I0307 21:42:35.099707 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/121505c3-5091-4945-a0aa-ec97b5f45ce5-etc-podinfo\") pod 
\"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.106602 master-0 kubenswrapper[16352]: I0307 21:42:35.100132 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-combined-ca-bundle\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.106602 master-0 kubenswrapper[16352]: I0307 21:42:35.100257 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data-custom\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.106602 master-0 kubenswrapper[16352]: I0307 21:42:35.100943 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 07 21:42:35.106602 master-0 kubenswrapper[16352]: I0307 21:42:35.101017 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-f8d69209-b4b7-4f2f-ae57-1537b9cc303f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a9636256-c038-462a-a0bf-2f9bbedc45c3\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/8adada68a6c5a3a2cfab09bd143204c75015844ea4f74f1c51b2381de8c45f50/globalmount\"" pod="openstack/ironic-conductor-0" Mar 07 21:42:35.111891 master-0 kubenswrapper[16352]: I0307 21:42:35.107863 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-scripts\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.116077 master-0 kubenswrapper[16352]: I0307 21:42:35.115952 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/121505c3-5091-4945-a0aa-ec97b5f45ce5-config-data\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.134863 master-0 kubenswrapper[16352]: I0307 21:42:35.134797 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpvhh\" (UniqueName: \"kubernetes.io/projected/121505c3-5091-4945-a0aa-ec97b5f45ce5-kube-api-access-jpvhh\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0" Mar 07 21:42:35.215423 master-0 kubenswrapper[16352]: I0307 21:42:35.213047 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c6ec644f-79e0-428a-a260-c2dde4320020" path="/var/lib/kubelet/pods/c6ec644f-79e0-428a-a260-c2dde4320020/volumes" Mar 07 21:42:35.224522 master-0 kubenswrapper[16352]: I0307 
21:42:35.224465 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" event={"ID":"e33d7a05-baac-460b-9f72-133d1f7c7b07","Type":"ContainerStarted","Data":"7bcbc5d61802b8b22d410678b665a9a2ce3134c36d214615bcf30d7449e860c9"} Mar 07 21:42:35.226021 master-0 kubenswrapper[16352]: I0307 21:42:35.225978 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" event={"ID":"55b7e31a-1da5-4528-b904-db7de86e1f26","Type":"ContainerStarted","Data":"5835ad4e2447bbc2b008668019eeafb5ce84fdf4b350f7869e2e5f9f070169e1"} Mar 07 21:42:35.228294 master-0 kubenswrapper[16352]: I0307 21:42:35.228254 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-sdzv8" event={"ID":"ac2528cb-3b05-45ee-adf7-67e32faaab12","Type":"ContainerStarted","Data":"7bd06b9455c7b486007744004863bc8005931459c8f25e88ae3f5497ac588905"} Mar 07 21:42:35.228294 master-0 kubenswrapper[16352]: I0307 21:42:35.228286 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-sdzv8" event={"ID":"ac2528cb-3b05-45ee-adf7-67e32faaab12","Type":"ContainerStarted","Data":"68e73d2ab90ab14c2dee7b36e0f41c1a81896f3922f496dd68207f0a2fb2824f"} Mar 07 21:42:35.231186 master-0 kubenswrapper[16352]: I0307 21:42:35.231136 16352 generic.go:334] "Generic (PLEG): container finished" podID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerID="07e1463666ed4bc8941e56a2fb227bab60740dff683bc71760cfd12a657ad0ab" exitCode=0 Mar 07 21:42:35.231243 master-0 kubenswrapper[16352]: I0307 21:42:35.231201 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"156d22b1-6d30-42b4-8cff-e5a563d2861d","Type":"ContainerDied","Data":"07e1463666ed4bc8941e56a2fb227bab60740dff683bc71760cfd12a657ad0ab"} Mar 07 21:42:35.234064 master-0 kubenswrapper[16352]: I0307 21:42:35.233948 16352 generic.go:334] "Generic (PLEG): container finished" 
podID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerID="161ae14484b48f26c24528d92a92ab3ea4cd07507499acb11ec2f9706bc4d932" exitCode=0 Mar 07 21:42:35.234064 master-0 kubenswrapper[16352]: I0307 21:42:35.234047 16352 generic.go:334] "Generic (PLEG): container finished" podID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerID="b822ddef479a10d11d2bfbac6c105b64ecc3912b656c9b8d5c04d08bcf7ebb01" exitCode=0 Mar 07 21:42:35.234163 master-0 kubenswrapper[16352]: I0307 21:42:35.234063 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"801bdc03-e76c-4dfb-a97e-1327be4a522a","Type":"ContainerDied","Data":"161ae14484b48f26c24528d92a92ab3ea4cd07507499acb11ec2f9706bc4d932"} Mar 07 21:42:35.234163 master-0 kubenswrapper[16352]: I0307 21:42:35.234142 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"801bdc03-e76c-4dfb-a97e-1327be4a522a","Type":"ContainerDied","Data":"b822ddef479a10d11d2bfbac6c105b64ecc3912b656c9b8d5c04d08bcf7ebb01"} Mar 07 21:42:35.235593 master-0 kubenswrapper[16352]: I0307 21:42:35.235538 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.317691 master-0 kubenswrapper[16352]: I0307 21:42:35.317586 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-create-sdzv8" podStartSLOduration=3.3175617 podStartE2EDuration="3.3175617s" podCreationTimestamp="2026-03-07 21:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:35.277391725 +0000 UTC m=+1478.348096794" watchObservedRunningTime="2026-03-07 21:42:35.3175617 +0000 UTC m=+1478.388266759" Mar 07 21:42:35.374053 master-0 kubenswrapper[16352]: I0307 21:42:35.373980 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:35.401880 master-0 kubenswrapper[16352]: I0307 21:42:35.401750 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-d904-account-create-update-tc485"] Mar 07 21:42:35.410533 master-0 kubenswrapper[16352]: I0307 21:42:35.408801 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:35.429121 master-0 kubenswrapper[16352]: W0307 21:42:35.428809 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod77f0edd2_211c_423c_b49f_d2c69df20f23.slice/crio-0dde76913aa2a6076df4a254151f3157bae575c49aea0128a79086a5c3ea87c0 WatchSource:0}: Error finding container 0dde76913aa2a6076df4a254151f3157bae575c49aea0128a79086a5c3ea87c0: Status 404 returned error can't find the container with id 0dde76913aa2a6076df4a254151f3157bae575c49aea0128a79086a5c3ea87c0 Mar 07 21:42:35.429121 master-0 kubenswrapper[16352]: I0307 21:42:35.429024 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-f97759bbc-nbv8w"] Mar 07 21:42:35.433296 master-0 kubenswrapper[16352]: 
W0307 21:42:35.433216 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e3ae5f4_4a11_4c09_9831_effc4a588f9b.slice/crio-05f0ec7f7abba55d66b2d3c188ce5adf02d933efe187795c57a799255b5b2432 WatchSource:0}: Error finding container 05f0ec7f7abba55d66b2d3c188ce5adf02d933efe187795c57a799255b5b2432: Status 404 returned error can't find the container with id 05f0ec7f7abba55d66b2d3c188ce5adf02d933efe187795c57a799255b5b2432 Mar 07 21:42:35.471773 master-0 kubenswrapper[16352]: I0307 21:42:35.471586 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:35.477031 master-0 kubenswrapper[16352]: I0307 21:42:35.476961 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.483038 master-0 kubenswrapper[16352]: I0307 21:42:35.481838 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-volume-lvm-iscsi-config-data" Mar 07 21:42:35.509738 master-0 kubenswrapper[16352]: I0307 21:42:35.509647 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"] Mar 07 21:42:35.565768 master-0 kubenswrapper[16352]: I0307 21:42:35.544595 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: E0307 21:42:35.617699 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode33d7a05_baac_460b_9f72_133d1f7c7b07.slice/crio-20c63d6d67f75faa42d31573f6b657b524e1a39f33655f7269b5e7d4ac65b806.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3666942_0515_42ad_aa2a_20be90d7bc83.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode33d7a05_baac_460b_9f72_133d1f7c7b07.slice/crio-conmon-20c63d6d67f75faa42d31573f6b657b524e1a39f33655f7269b5e7d4ac65b806.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.621585 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-combined-ca-bundle\") pod \"156d22b1-6d30-42b4-8cff-e5a563d2861d\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.621813 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-scripts\") pod \"156d22b1-6d30-42b4-8cff-e5a563d2861d\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.621843 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ftr6q\" (UniqueName: \"kubernetes.io/projected/156d22b1-6d30-42b4-8cff-e5a563d2861d-kube-api-access-ftr6q\") pod \"156d22b1-6d30-42b4-8cff-e5a563d2861d\" (UID: 
\"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.621913 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/156d22b1-6d30-42b4-8cff-e5a563d2861d-etc-machine-id\") pod \"156d22b1-6d30-42b4-8cff-e5a563d2861d\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.621959 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data-custom\") pod \"156d22b1-6d30-42b4-8cff-e5a563d2861d\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.622065 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data\") pod \"156d22b1-6d30-42b4-8cff-e5a563d2861d\" (UID: \"156d22b1-6d30-42b4-8cff-e5a563d2861d\") " Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.622513 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-machine-id\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.622570 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-sys\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 
21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.622650 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-locks-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.622698 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-lib-modules\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.622751 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-locks-brick\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.623568 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-lib-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.623605 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-iscsi\") pod 
\"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.623659 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-config-data-custom\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.623710 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xmtj\" (UniqueName: \"kubernetes.io/projected/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-kube-api-access-9xmtj\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.623833 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-nvme\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.623893 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-combined-ca-bundle\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.623983 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-run\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.624018 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-scripts\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.624338 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-config-data\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.624910 master-0 kubenswrapper[16352]: I0307 21:42:35.624398 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-dev\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.629034 master-0 kubenswrapper[16352]: I0307 21:42:35.626785 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/156d22b1-6d30-42b4-8cff-e5a563d2861d-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "156d22b1-6d30-42b4-8cff-e5a563d2861d" (UID: "156d22b1-6d30-42b4-8cff-e5a563d2861d"). InnerVolumeSpecName "etc-machine-id". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 07 21:42:35.651896 master-0 kubenswrapper[16352]: I0307 21:42:35.633272 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "156d22b1-6d30-42b4-8cff-e5a563d2861d" (UID: "156d22b1-6d30-42b4-8cff-e5a563d2861d"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:35.653085 master-0 kubenswrapper[16352]: I0307 21:42:35.652962 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/156d22b1-6d30-42b4-8cff-e5a563d2861d-kube-api-access-ftr6q" (OuterVolumeSpecName: "kube-api-access-ftr6q") pod "156d22b1-6d30-42b4-8cff-e5a563d2861d" (UID: "156d22b1-6d30-42b4-8cff-e5a563d2861d"). InnerVolumeSpecName "kube-api-access-ftr6q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:35.664726 master-0 kubenswrapper[16352]: I0307 21:42:35.656855 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-scripts" (OuterVolumeSpecName: "scripts") pod "156d22b1-6d30-42b4-8cff-e5a563d2861d" (UID: "156d22b1-6d30-42b4-8cff-e5a563d2861d"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731121 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-machine-id\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731307 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-machine-id\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731422 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-sys\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731631 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-sys\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731678 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-locks-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " 
pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731791 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-lib-modules\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731858 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-locks-brick\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731902 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-lib-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731973 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-lib-modules\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.731979 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-iscsi\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: 
\"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732008 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-iscsi\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732074 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-locks-brick\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732100 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-lib-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732123 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-var-locks-cinder\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732198 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-config-data-custom\") pod 
\"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732227 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xmtj\" (UniqueName: \"kubernetes.io/projected/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-kube-api-access-9xmtj\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732466 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-nvme\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732542 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-combined-ca-bundle\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732651 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-run\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.732698 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-scripts\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733008 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-config-data\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733050 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-dev\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733174 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733190 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ftr6q\" (UniqueName: \"kubernetes.io/projected/156d22b1-6d30-42b4-8cff-e5a563d2861d-kube-api-access-ftr6q\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733201 16352 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/156d22b1-6d30-42b4-8cff-e5a563d2861d-etc-machine-id\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733211 16352 reconciler_common.go:293] "Volume detached for volume 
\"config-data-custom\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data-custom\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733246 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-dev\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.733786 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-etc-nvme\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.734877 master-0 kubenswrapper[16352]: I0307 21:42:35.734015 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-run\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.742086 master-0 kubenswrapper[16352]: I0307 21:42:35.737550 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-scripts\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:35.742086 master-0 kubenswrapper[16352]: I0307 21:42:35.738632 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-config-data\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: 
\"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0"
Mar 07 21:42:35.742086 master-0 kubenswrapper[16352]: I0307 21:42:35.739267 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-combined-ca-bundle\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0"
Mar 07 21:42:35.744095 master-0 kubenswrapper[16352]: I0307 21:42:35.744007 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-config-data-custom\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0"
Mar 07 21:42:35.749737 master-0 kubenswrapper[16352]: I0307 21:42:35.749572 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xmtj\" (UniqueName: \"kubernetes.io/projected/3f8fb2d4-ec08-484a-afcc-c09964dc9c8f-kube-api-access-9xmtj\") pod \"cinder-86971-volume-lvm-iscsi-0\" (UID: \"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f\") " pod="openstack/cinder-86971-volume-lvm-iscsi-0"
Mar 07 21:42:35.756677 master-0 kubenswrapper[16352]: I0307 21:42:35.756541 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "156d22b1-6d30-42b4-8cff-e5a563d2861d" (UID: "156d22b1-6d30-42b4-8cff-e5a563d2861d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:35.799900 master-0 kubenswrapper[16352]: I0307 21:42:35.799721 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-volume-lvm-iscsi-0"
Mar 07 21:42:35.841522 master-0 kubenswrapper[16352]: I0307 21:42:35.835966 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:35.841522 master-0 kubenswrapper[16352]: I0307 21:42:35.837497 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data" (OuterVolumeSpecName: "config-data") pod "156d22b1-6d30-42b4-8cff-e5a563d2861d" (UID: "156d22b1-6d30-42b4-8cff-e5a563d2861d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:35.957263 master-0 kubenswrapper[16352]: I0307 21:42:35.957188 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/156d22b1-6d30-42b4-8cff-e5a563d2861d-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:35.981154 master-0 kubenswrapper[16352]: I0307 21:42:35.980357 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.104746 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-dev\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.104810 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-iscsi\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.104890 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-brick\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.104943 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-dev" (OuterVolumeSpecName: "dev") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.104994 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-machine-id\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105022 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-lib-cinder\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105029 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105060 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105065 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-sys\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105134 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-sys" (OuterVolumeSpecName: "sys") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105181 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105184 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6j8h\" (UniqueName: \"kubernetes.io/projected/801bdc03-e76c-4dfb-a97e-1327be4a522a-kube-api-access-k6j8h\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.105725 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-lib-cinder" (OuterVolumeSpecName: "var-lib-cinder") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "var-lib-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.106825 16352 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-dev\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.106843 16352 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-iscsi\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.106859 16352 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-brick\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.106873 16352 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-machine-id\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.106883 16352 reconciler_common.go:293] "Volume detached for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-lib-cinder\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.114750 master-0 kubenswrapper[16352]: I0307 21:42:36.106892 16352 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-sys\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.128761 master-0 kubenswrapper[16352]: I0307 21:42:36.121439 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/801bdc03-e76c-4dfb-a97e-1327be4a522a-kube-api-access-k6j8h" (OuterVolumeSpecName: "kube-api-access-k6j8h") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "kube-api-access-k6j8h". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209213 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-scripts\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209309 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-combined-ca-bundle\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209384 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-run\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209438 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-lib-modules\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209460 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-cinder\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209515 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data-custom\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209539 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-nvme\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.209602 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data\") pod \"801bdc03-e76c-4dfb-a97e-1327be4a522a\" (UID: \"801bdc03-e76c-4dfb-a97e-1327be4a522a\") "
Mar 07 21:42:36.213733 master-0 kubenswrapper[16352]: I0307 21:42:36.211857 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.214211 master-0 kubenswrapper[16352]: I0307 21:42:36.214025 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6j8h\" (UniqueName: \"kubernetes.io/projected/801bdc03-e76c-4dfb-a97e-1327be4a522a-kube-api-access-k6j8h\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.217731 master-0 kubenswrapper[16352]: I0307 21:42:36.215202 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-run" (OuterVolumeSpecName: "run") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.217731 master-0 kubenswrapper[16352]: I0307 21:42:36.215250 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-cinder" (OuterVolumeSpecName: "var-locks-cinder") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "var-locks-cinder". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.217731 master-0 kubenswrapper[16352]: I0307 21:42:36.215278 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 07 21:42:36.231733 master-0 kubenswrapper[16352]: I0307 21:42:36.222076 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:36.304831 master-0 kubenswrapper[16352]: I0307 21:42:36.300026 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-scripts" (OuterVolumeSpecName: "scripts") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:36.316718 master-0 kubenswrapper[16352]: I0307 21:42:36.315261 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"801bdc03-e76c-4dfb-a97e-1327be4a522a","Type":"ContainerDied","Data":"6c16c6d136eddd13f7f322d69ffa4bf4d9e645a845caded0ab71b591757e630b"}
Mar 07 21:42:36.316718 master-0 kubenswrapper[16352]: I0307 21:42:36.315362 16352 scope.go:117] "RemoveContainer" containerID="161ae14484b48f26c24528d92a92ab3ea4cd07507499acb11ec2f9706bc4d932"
Mar 07 21:42:36.316718 master-0 kubenswrapper[16352]: I0307 21:42:36.315584 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:36.344767 master-0 kubenswrapper[16352]: I0307 21:42:36.342253 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.344767 master-0 kubenswrapper[16352]: I0307 21:42:36.342333 16352 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-run\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.344767 master-0 kubenswrapper[16352]: I0307 21:42:36.342351 16352 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-lib-modules\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.344767 master-0 kubenswrapper[16352]: I0307 21:42:36.342366 16352 reconciler_common.go:293] "Volume detached for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-var-locks-cinder\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.344767 master-0 kubenswrapper[16352]: I0307 21:42:36.342382 16352 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.344767 master-0 kubenswrapper[16352]: I0307 21:42:36.342395 16352 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/801bdc03-e76c-4dfb-a97e-1327be4a522a-etc-nvme\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.350379 master-0 kubenswrapper[16352]: I0307 21:42:36.349605 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerStarted","Data":"05f0ec7f7abba55d66b2d3c188ce5adf02d933efe187795c57a799255b5b2432"}
Mar 07 21:42:36.362480 master-0 kubenswrapper[16352]: I0307 21:42:36.356000 16352 generic.go:334] "Generic (PLEG): container finished" podID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerID="20c63d6d67f75faa42d31573f6b657b524e1a39f33655f7269b5e7d4ac65b806" exitCode=0
Mar 07 21:42:36.362480 master-0 kubenswrapper[16352]: I0307 21:42:36.356119 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" event={"ID":"e33d7a05-baac-460b-9f72-133d1f7c7b07","Type":"ContainerDied","Data":"20c63d6d67f75faa42d31573f6b657b524e1a39f33655f7269b5e7d4ac65b806"}
Mar 07 21:42:36.362480 master-0 kubenswrapper[16352]: I0307 21:42:36.362316 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-d904-account-create-update-tc485" event={"ID":"77f0edd2-211c-423c-b49f-d2c69df20f23","Type":"ContainerStarted","Data":"0dde76913aa2a6076df4a254151f3157bae575c49aea0128a79086a5c3ea87c0"}
Mar 07 21:42:36.388224 master-0 kubenswrapper[16352]: I0307 21:42:36.382307 16352 generic.go:334] "Generic (PLEG): container finished" podID="ac2528cb-3b05-45ee-adf7-67e32faaab12" containerID="7bd06b9455c7b486007744004863bc8005931459c8f25e88ae3f5497ac588905" exitCode=0
Mar 07 21:42:36.388224 master-0 kubenswrapper[16352]: I0307 21:42:36.382479 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-sdzv8" event={"ID":"ac2528cb-3b05-45ee-adf7-67e32faaab12","Type":"ContainerDied","Data":"7bd06b9455c7b486007744004863bc8005931459c8f25e88ae3f5497ac588905"}
Mar 07 21:42:36.408973 master-0 kubenswrapper[16352]: I0307 21:42:36.399289 16352 scope.go:117] "RemoveContainer" containerID="b822ddef479a10d11d2bfbac6c105b64ecc3912b656c9b8d5c04d08bcf7ebb01"
Mar 07 21:42:36.408973 master-0 kubenswrapper[16352]: I0307 21:42:36.405942 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"156d22b1-6d30-42b4-8cff-e5a563d2861d","Type":"ContainerDied","Data":"90001a728a47c32c0357761d9ffbfda8c8b4ade72dab2dd1fa37eb7eef6323ad"}
Mar 07 21:42:36.408973 master-0 kubenswrapper[16352]: I0307 21:42:36.406429 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.484564 master-0 kubenswrapper[16352]: I0307 21:42:36.484501 16352 scope.go:117] "RemoveContainer" containerID="df24f3966ee9c06e2387b5a4896a043900b48426a93d9c0a9c63c6527a518b8b"
Mar 07 21:42:36.494049 master-0 kubenswrapper[16352]: I0307 21:42:36.493644 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-volume-lvm-iscsi-0"]
Mar 07 21:42:36.559527 master-0 kubenswrapper[16352]: I0307 21:42:36.559384 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-scheduler-0"]
Mar 07 21:42:36.571407 master-0 kubenswrapper[16352]: I0307 21:42:36.571337 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-86971-scheduler-0"]
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.589821 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-scheduler-0"]
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: E0307 21:42:36.590716 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="cinder-scheduler"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.590735 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="cinder-scheduler"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: E0307 21:42:36.590756 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="probe"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.590763 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="probe"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: E0307 21:42:36.590799 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="cinder-backup"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.590808 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="cinder-backup"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: E0307 21:42:36.590822 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="probe"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.590831 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="probe"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.591114 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="cinder-scheduler"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.591154 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="probe"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.591202 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" containerName="cinder-backup"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.591217 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" containerName="probe"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.593007 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.596470 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-scheduler-config-data"
Mar 07 21:42:36.602042 master-0 kubenswrapper[16352]: I0307 21:42:36.600897 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-scheduler-0"]
Mar 07 21:42:36.608799 master-0 kubenswrapper[16352]: I0307 21:42:36.608598 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-f8d69209-b4b7-4f2f-ae57-1537b9cc303f\" (UniqueName: \"kubernetes.io/csi/topolvm.io^a9636256-c038-462a-a0bf-2f9bbedc45c3\") pod \"ironic-conductor-0\" (UID: \"121505c3-5091-4945-a0aa-ec97b5f45ce5\") " pod="openstack/ironic-conductor-0"
Mar 07 21:42:36.620122 master-0 kubenswrapper[16352]: I0307 21:42:36.618868 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:36.639319 master-0 kubenswrapper[16352]: I0307 21:42:36.639097 16352 scope.go:117] "RemoveContainer" containerID="07e1463666ed4bc8941e56a2fb227bab60740dff683bc71760cfd12a657ad0ab"
Mar 07 21:42:36.644859 master-0 kubenswrapper[16352]: I0307 21:42:36.644807 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data" (OuterVolumeSpecName: "config-data") pod "801bdc03-e76c-4dfb-a97e-1327be4a522a" (UID: "801bdc03-e76c-4dfb-a97e-1327be4a522a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:36.686434 master-0 kubenswrapper[16352]: I0307 21:42:36.685056 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-scripts\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.686434 master-0 kubenswrapper[16352]: I0307 21:42:36.685605 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-combined-ca-bundle\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.686434 master-0 kubenswrapper[16352]: I0307 21:42:36.685816 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-config-data-custom\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.686713 master-0 kubenswrapper[16352]: I0307 21:42:36.686434 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0badc58c-c623-44ad-8a69-7df699628dba-etc-machine-id\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.686860 master-0 kubenswrapper[16352]: I0307 21:42:36.686763 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbftn\" (UniqueName: \"kubernetes.io/projected/0badc58c-c623-44ad-8a69-7df699628dba-kube-api-access-zbftn\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.689446 master-0 kubenswrapper[16352]: I0307 21:42:36.689381 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-config-data\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.689669 master-0 kubenswrapper[16352]: I0307 21:42:36.689636 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.689669 master-0 kubenswrapper[16352]: I0307 21:42:36.689659 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/801bdc03-e76c-4dfb-a97e-1327be4a522a-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:36.706980 master-0 kubenswrapper[16352]: I0307 21:42:36.705830 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-conductor-0"
Mar 07 21:42:36.794538 master-0 kubenswrapper[16352]: I0307 21:42:36.794417 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-combined-ca-bundle\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.797788 master-0 kubenswrapper[16352]: I0307 21:42:36.795497 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-config-data-custom\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.797788 master-0 kubenswrapper[16352]: I0307 21:42:36.795748 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0badc58c-c623-44ad-8a69-7df699628dba-etc-machine-id\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.797788 master-0 kubenswrapper[16352]: I0307 21:42:36.795921 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbftn\" (UniqueName: \"kubernetes.io/projected/0badc58c-c623-44ad-8a69-7df699628dba-kube-api-access-zbftn\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.797788 master-0 kubenswrapper[16352]: I0307 21:42:36.796232 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-config-data\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.797788 master-0 kubenswrapper[16352]: I0307 21:42:36.796344 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-scripts\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.799504 master-0 kubenswrapper[16352]: I0307 21:42:36.797780 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/0badc58c-c623-44ad-8a69-7df699628dba-etc-machine-id\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.806431 master-0 kubenswrapper[16352]: I0307 21:42:36.806397 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-scripts\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.806577 master-0 kubenswrapper[16352]: I0307 21:42:36.806533 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-config-data-custom\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.812082 master-0 kubenswrapper[16352]: I0307 21:42:36.811982 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-config-data\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.821517 master-0 kubenswrapper[16352]: I0307 21:42:36.821449 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0badc58c-c623-44ad-8a69-7df699628dba-combined-ca-bundle\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.822431 master-0 kubenswrapper[16352]: I0307 21:42:36.822372 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbftn\" (UniqueName: \"kubernetes.io/projected/0badc58c-c623-44ad-8a69-7df699628dba-kube-api-access-zbftn\") pod \"cinder-86971-scheduler-0\" (UID: \"0badc58c-c623-44ad-8a69-7df699628dba\") " pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.898941 master-0 kubenswrapper[16352]: I0307 21:42:36.898882 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-scheduler-0"
Mar 07 21:42:36.992715 master-0 kubenswrapper[16352]: I0307 21:42:36.989925 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-86971-backup-0"]
Mar 07 21:42:37.021571 master-0 kubenswrapper[16352]: I0307 21:42:37.021506 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-86971-backup-0"]
Mar 07 21:42:37.116995 master-0 kubenswrapper[16352]: I0307 21:42:37.112836 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-86971-backup-0"]
Mar 07 21:42:37.119162 master-0 kubenswrapper[16352]: I0307 21:42:37.118616 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.144631 master-0 kubenswrapper[16352]: I0307 21:42:37.136145 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-86971-backup-config-data"
Mar 07 21:42:37.180326 master-0 kubenswrapper[16352]: I0307 21:42:37.179789 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-backup-0"]
Mar 07 21:42:37.273520 master-0 kubenswrapper[16352]: I0307 21:42:37.262287 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="156d22b1-6d30-42b4-8cff-e5a563d2861d" path="/var/lib/kubelet/pods/156d22b1-6d30-42b4-8cff-e5a563d2861d/volumes"
Mar 07 21:42:37.277739 master-0 kubenswrapper[16352]: I0307 21:42:37.277331 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-lib-modules\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.281731 master-0 kubenswrapper[16352]: I0307 21:42:37.278877 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-lib-cinder\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.281731 master-0 kubenswrapper[16352]: I0307 21:42:37.279345 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-machine-id\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.281731 master-0 kubenswrapper[16352]: I0307 21:42:37.280065 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-combined-ca-bundle\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.281731 master-0 kubenswrapper[16352]: I0307 21:42:37.281466 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-run\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.281731 master-0 kubenswrapper[16352]: I0307 21:42:37.281573 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-nvme\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.282056 master-0 kubenswrapper[16352]: I0307 21:42:37.281955 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-iscsi\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.282056 master-0 kubenswrapper[16352]: I0307 21:42:37.282029 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b89tm\" (UniqueName: \"kubernetes.io/projected/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-kube-api-access-b89tm\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0"
Mar 07 21:42:37.288034 master-0 kubenswrapper[16352]: I0307
21:42:37.282612 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-config-data-custom\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.288034 master-0 kubenswrapper[16352]: I0307 21:42:37.282677 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-sys\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.288034 master-0 kubenswrapper[16352]: I0307 21:42:37.286159 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-config-data\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.288034 master-0 kubenswrapper[16352]: I0307 21:42:37.286394 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-scripts\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.288034 master-0 kubenswrapper[16352]: I0307 21:42:37.286458 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-locks-brick\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.288034 master-0 kubenswrapper[16352]: I0307 
21:42:37.286553 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-dev\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.288034 master-0 kubenswrapper[16352]: I0307 21:42:37.286593 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-locks-cinder\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.296083 master-0 kubenswrapper[16352]: I0307 21:42:37.293624 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="801bdc03-e76c-4dfb-a97e-1327be4a522a" path="/var/lib/kubelet/pods/801bdc03-e76c-4dfb-a97e-1327be4a522a/volumes" Mar 07 21:42:37.296083 master-0 kubenswrapper[16352]: I0307 21:42:37.295484 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3666942-0515-42ad-aa2a-20be90d7bc83" path="/var/lib/kubelet/pods/c3666942-0515-42ad-aa2a-20be90d7bc83/volumes" Mar 07 21:42:37.297662 master-0 kubenswrapper[16352]: I0307 21:42:37.296948 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-6767bc4dd7-cp8fn"] Mar 07 21:42:37.320274 master-0 kubenswrapper[16352]: I0307 21:42:37.319932 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.332220 master-0 kubenswrapper[16352]: I0307 21:42:37.332158 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-internal-svc" Mar 07 21:42:37.332436 master-0 kubenswrapper[16352]: I0307 21:42:37.332353 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-public-svc" Mar 07 21:42:37.363906 master-0 kubenswrapper[16352]: I0307 21:42:37.361166 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6767bc4dd7-cp8fn"] Mar 07 21:42:37.390167 master-0 kubenswrapper[16352]: I0307 21:42:37.390091 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-config-data\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.390731 master-0 kubenswrapper[16352]: I0307 21:42:37.390707 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-scripts\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.391243 master-0 kubenswrapper[16352]: I0307 21:42:37.391170 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-locks-brick\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.391535 master-0 kubenswrapper[16352]: I0307 21:42:37.391511 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-dev\") pod 
\"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392047 master-0 kubenswrapper[16352]: I0307 21:42:37.391950 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-locks-brick\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392125 master-0 kubenswrapper[16352]: I0307 21:42:37.392093 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-locks-cinder\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392304 master-0 kubenswrapper[16352]: I0307 21:42:37.392250 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-lib-modules\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392367 master-0 kubenswrapper[16352]: I0307 21:42:37.392339 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-lib-cinder\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392490 master-0 kubenswrapper[16352]: I0307 21:42:37.392442 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-machine-id\") pod \"cinder-86971-backup-0\" (UID: 
\"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392728 master-0 kubenswrapper[16352]: I0307 21:42:37.392621 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-combined-ca-bundle\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392728 master-0 kubenswrapper[16352]: I0307 21:42:37.392663 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-run\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.392872 master-0 kubenswrapper[16352]: I0307 21:42:37.392770 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-nvme\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.392945 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-cinder\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-lib-cinder\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393041 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-dev\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 
kubenswrapper[16352]: I0307 21:42:37.393044 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-iscsi\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393103 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-cinder\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-var-locks-cinder\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393108 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-run\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393079 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-iscsi\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393163 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-nvme\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393190 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-etc-machine-id\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393203 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b89tm\" (UniqueName: \"kubernetes.io/projected/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-kube-api-access-b89tm\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393212 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-lib-modules\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393506 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-config-data-custom\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393558 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-sys\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.394331 master-0 kubenswrapper[16352]: I0307 21:42:37.393772 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-sys\") pod 
\"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.399068 master-0 kubenswrapper[16352]: I0307 21:42:37.399000 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-config-data\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.399264 master-0 kubenswrapper[16352]: I0307 21:42:37.399214 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-scripts\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.405516 master-0 kubenswrapper[16352]: I0307 21:42:37.405448 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-combined-ca-bundle\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.405594 master-0 kubenswrapper[16352]: I0307 21:42:37.405501 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-config-data-custom\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.426383 master-0 kubenswrapper[16352]: I0307 21:42:37.425416 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b89tm\" (UniqueName: \"kubernetes.io/projected/40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd-kube-api-access-b89tm\") pod \"cinder-86971-backup-0\" (UID: \"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd\") " 
pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.473918 master-0 kubenswrapper[16352]: I0307 21:42:37.472428 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-d904-account-create-update-tc485" event={"ID":"77f0edd2-211c-423c-b49f-d2c69df20f23","Type":"ContainerStarted","Data":"97a4dba59598ff3cd37a1a636d9fd3ae198b019c02d082f9d48e18bddc3419bf"} Mar 07 21:42:37.497180 master-0 kubenswrapper[16352]: I0307 21:42:37.497098 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-combined-ca-bundle\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497513 master-0 kubenswrapper[16352]: I0307 21:42:37.497221 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497513 master-0 kubenswrapper[16352]: I0307 21:42:37.497330 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data-custom\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497513 master-0 kubenswrapper[16352]: I0307 21:42:37.497397 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-internal-tls-certs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") 
" pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497513 master-0 kubenswrapper[16352]: I0307 21:42:37.497480 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tslt6\" (UniqueName: \"kubernetes.io/projected/0d0a7bb8-c118-4b85-aaed-0eee4090a321-kube-api-access-tslt6\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497657 master-0 kubenswrapper[16352]: I0307 21:42:37.497534 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-scripts\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497657 master-0 kubenswrapper[16352]: I0307 21:42:37.497587 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data-merged\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497657 master-0 kubenswrapper[16352]: I0307 21:42:37.497616 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-public-tls-certs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.497914 master-0 kubenswrapper[16352]: I0307 21:42:37.497880 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0d0a7bb8-c118-4b85-aaed-0eee4090a321-etc-podinfo\") pod 
\"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.498473 master-0 kubenswrapper[16352]: I0307 21:42:37.498439 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d0a7bb8-c118-4b85-aaed-0eee4090a321-logs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.501339 master-0 kubenswrapper[16352]: I0307 21:42:37.501258 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f","Type":"ContainerStarted","Data":"76c6ee78c7b5f425e8d413fe23ec4fd4aa304930c339a6909b96317ae2f61a18"} Mar 07 21:42:37.556183 master-0 kubenswrapper[16352]: I0307 21:42:37.555950 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-86971-backup-0" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608465 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-combined-ca-bundle\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608557 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608619 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" 
(UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data-custom\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608638 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-internal-tls-certs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608683 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tslt6\" (UniqueName: \"kubernetes.io/projected/0d0a7bb8-c118-4b85-aaed-0eee4090a321-kube-api-access-tslt6\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608746 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-scripts\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608801 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data-merged\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608824 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-public-tls-certs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608907 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0d0a7bb8-c118-4b85-aaed-0eee4090a321-etc-podinfo\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.608961 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d0a7bb8-c118-4b85-aaed-0eee4090a321-logs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.609804 master-0 kubenswrapper[16352]: I0307 21:42:37.609469 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0d0a7bb8-c118-4b85-aaed-0eee4090a321-logs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.613413 master-0 kubenswrapper[16352]: I0307 21:42:37.613355 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data-merged\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.623417 master-0 kubenswrapper[16352]: I0307 21:42:37.623146 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-combined-ca-bundle\") pod 
\"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.623417 master-0 kubenswrapper[16352]: I0307 21:42:37.623178 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.639733 master-0 kubenswrapper[16352]: I0307 21:42:37.639388 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/0d0a7bb8-c118-4b85-aaed-0eee4090a321-etc-podinfo\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.641032 master-0 kubenswrapper[16352]: I0307 21:42:37.639900 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-public-tls-certs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.641843 master-0 kubenswrapper[16352]: I0307 21:42:37.641731 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-config-data-custom\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.643570 master-0 kubenswrapper[16352]: I0307 21:42:37.643491 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-d904-account-create-update-tc485" podStartSLOduration=5.643468751 podStartE2EDuration="5.643468751s" podCreationTimestamp="2026-03-07 21:42:32 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:37.639441164 +0000 UTC m=+1480.710146223" watchObservedRunningTime="2026-03-07 21:42:37.643468751 +0000 UTC m=+1480.714173810" Mar 07 21:42:37.659463 master-0 kubenswrapper[16352]: I0307 21:42:37.656933 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-scripts\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.683255 master-0 kubenswrapper[16352]: I0307 21:42:37.682241 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:37.758297 master-0 kubenswrapper[16352]: I0307 21:42:37.758155 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:42:37.775849 master-0 kubenswrapper[16352]: I0307 21:42:37.775799 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/0d0a7bb8-c118-4b85-aaed-0eee4090a321-internal-tls-certs\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.787757 master-0 kubenswrapper[16352]: I0307 21:42:37.778260 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tslt6\" (UniqueName: \"kubernetes.io/projected/0d0a7bb8-c118-4b85-aaed-0eee4090a321-kube-api-access-tslt6\") pod \"ironic-6767bc4dd7-cp8fn\" (UID: \"0d0a7bb8-c118-4b85-aaed-0eee4090a321\") " pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:37.947508 master-0 kubenswrapper[16352]: I0307 21:42:37.947295 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:38.232641 master-0 kubenswrapper[16352]: I0307 21:42:38.231241 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:38.240520 master-0 kubenswrapper[16352]: I0307 21:42:38.240444 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-5dbd89f674-7gtrq"] Mar 07 21:42:38.241438 master-0 kubenswrapper[16352]: E0307 21:42:38.241400 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2528cb-3b05-45ee-adf7-67e32faaab12" containerName="mariadb-database-create" Mar 07 21:42:38.241506 master-0 kubenswrapper[16352]: I0307 21:42:38.241445 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2528cb-3b05-45ee-adf7-67e32faaab12" containerName="mariadb-database-create" Mar 07 21:42:38.241935 master-0 kubenswrapper[16352]: I0307 21:42:38.241906 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac2528cb-3b05-45ee-adf7-67e32faaab12" containerName="mariadb-database-create" Mar 07 21:42:38.247624 master-0 kubenswrapper[16352]: I0307 21:42:38.247564 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.268416 master-0 kubenswrapper[16352]: I0307 21:42:38.268350 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2528cb-3b05-45ee-adf7-67e32faaab12-operator-scripts\") pod \"ac2528cb-3b05-45ee-adf7-67e32faaab12\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " Mar 07 21:42:38.268741 master-0 kubenswrapper[16352]: I0307 21:42:38.268651 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4rxz\" (UniqueName: \"kubernetes.io/projected/ac2528cb-3b05-45ee-adf7-67e32faaab12-kube-api-access-x4rxz\") pod \"ac2528cb-3b05-45ee-adf7-67e32faaab12\" (UID: \"ac2528cb-3b05-45ee-adf7-67e32faaab12\") " Mar 07 21:42:38.268896 master-0 kubenswrapper[16352]: I0307 21:42:38.268868 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac2528cb-3b05-45ee-adf7-67e32faaab12-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "ac2528cb-3b05-45ee-adf7-67e32faaab12" (UID: "ac2528cb-3b05-45ee-adf7-67e32faaab12"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:38.271880 master-0 kubenswrapper[16352]: I0307 21:42:38.271841 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/ac2528cb-3b05-45ee-adf7-67e32faaab12-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:38.274563 master-0 kubenswrapper[16352]: I0307 21:42:38.274509 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5dbd89f674-7gtrq"] Mar 07 21:42:38.317384 master-0 kubenswrapper[16352]: I0307 21:42:38.317305 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2528cb-3b05-45ee-adf7-67e32faaab12-kube-api-access-x4rxz" (OuterVolumeSpecName: "kube-api-access-x4rxz") pod "ac2528cb-3b05-45ee-adf7-67e32faaab12" (UID: "ac2528cb-3b05-45ee-adf7-67e32faaab12"). InnerVolumeSpecName "kube-api-access-x4rxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:38.374832 master-0 kubenswrapper[16352]: I0307 21:42:38.374764 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdgw4\" (UniqueName: \"kubernetes.io/projected/3afdaf61-6be7-431c-8256-66e26e3a27a8-kube-api-access-pdgw4\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.375140 master-0 kubenswrapper[16352]: I0307 21:42:38.374870 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-combined-ca-bundle\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.375140 master-0 kubenswrapper[16352]: I0307 21:42:38.374908 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-config-data\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.375140 master-0 kubenswrapper[16352]: I0307 21:42:38.374996 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-internal-tls-certs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.375140 master-0 kubenswrapper[16352]: I0307 21:42:38.375035 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-public-tls-certs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.375140 master-0 kubenswrapper[16352]: I0307 21:42:38.375066 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-scripts\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.375140 master-0 kubenswrapper[16352]: I0307 21:42:38.375098 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3afdaf61-6be7-431c-8256-66e26e3a27a8-logs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.375365 master-0 kubenswrapper[16352]: 
I0307 21:42:38.375342 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4rxz\" (UniqueName: \"kubernetes.io/projected/ac2528cb-3b05-45ee-adf7-67e32faaab12-kube-api-access-x4rxz\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:38.480805 master-0 kubenswrapper[16352]: I0307 21:42:38.480726 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-internal-tls-certs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.481126 master-0 kubenswrapper[16352]: I0307 21:42:38.480947 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-public-tls-certs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.481126 master-0 kubenswrapper[16352]: I0307 21:42:38.481015 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-scripts\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.481126 master-0 kubenswrapper[16352]: I0307 21:42:38.481069 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3afdaf61-6be7-431c-8256-66e26e3a27a8-logs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.481327 master-0 kubenswrapper[16352]: I0307 21:42:38.481252 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pdgw4\" 
(UniqueName: \"kubernetes.io/projected/3afdaf61-6be7-431c-8256-66e26e3a27a8-kube-api-access-pdgw4\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.481369 master-0 kubenswrapper[16352]: I0307 21:42:38.481337 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-combined-ca-bundle\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.481433 master-0 kubenswrapper[16352]: I0307 21:42:38.481372 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-config-data\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.481978 master-0 kubenswrapper[16352]: I0307 21:42:38.481913 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3afdaf61-6be7-431c-8256-66e26e3a27a8-logs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.484920 master-0 kubenswrapper[16352]: I0307 21:42:38.484805 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-scripts\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.486367 master-0 kubenswrapper[16352]: I0307 21:42:38.486320 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-public-tls-certs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.487536 master-0 kubenswrapper[16352]: I0307 21:42:38.487257 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-combined-ca-bundle\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.487629 master-0 kubenswrapper[16352]: I0307 21:42:38.487538 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-config-data\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.490403 master-0 kubenswrapper[16352]: I0307 21:42:38.490350 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/3afdaf61-6be7-431c-8256-66e26e3a27a8-internal-tls-certs\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.503742 master-0 kubenswrapper[16352]: I0307 21:42:38.503670 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdgw4\" (UniqueName: \"kubernetes.io/projected/3afdaf61-6be7-431c-8256-66e26e3a27a8-kube-api-access-pdgw4\") pod \"placement-5dbd89f674-7gtrq\" (UID: \"3afdaf61-6be7-431c-8256-66e26e3a27a8\") " pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:38.521678 master-0 kubenswrapper[16352]: I0307 21:42:38.521607 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" 
event={"ID":"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f","Type":"ContainerStarted","Data":"e5433596f1286d8c87dcf4f8f5d836406f46afbf2ebe0d1fbe3f192618cec0b0"} Mar 07 21:42:38.524344 master-0 kubenswrapper[16352]: I0307 21:42:38.524269 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" event={"ID":"e33d7a05-baac-460b-9f72-133d1f7c7b07","Type":"ContainerStarted","Data":"f568bf1ba59818260d2de9d934dd50f972a9e11ea5a20a8ddc126c606888ab7a"} Mar 07 21:42:38.524614 master-0 kubenswrapper[16352]: I0307 21:42:38.524553 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" Mar 07 21:42:38.529010 master-0 kubenswrapper[16352]: I0307 21:42:38.528914 16352 generic.go:334] "Generic (PLEG): container finished" podID="77f0edd2-211c-423c-b49f-d2c69df20f23" containerID="97a4dba59598ff3cd37a1a636d9fd3ae198b019c02d082f9d48e18bddc3419bf" exitCode=0 Mar 07 21:42:38.529220 master-0 kubenswrapper[16352]: I0307 21:42:38.529111 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-d904-account-create-update-tc485" event={"ID":"77f0edd2-211c-423c-b49f-d2c69df20f23","Type":"ContainerDied","Data":"97a4dba59598ff3cd37a1a636d9fd3ae198b019c02d082f9d48e18bddc3419bf"} Mar 07 21:42:38.532468 master-0 kubenswrapper[16352]: I0307 21:42:38.532428 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-create-sdzv8" event={"ID":"ac2528cb-3b05-45ee-adf7-67e32faaab12","Type":"ContainerDied","Data":"68e73d2ab90ab14c2dee7b36e0f41c1a81896f3922f496dd68207f0a2fb2824f"} Mar 07 21:42:38.532544 master-0 kubenswrapper[16352]: I0307 21:42:38.532463 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68e73d2ab90ab14c2dee7b36e0f41c1a81896f3922f496dd68207f0a2fb2824f" Mar 07 21:42:38.532544 master-0 kubenswrapper[16352]: I0307 21:42:38.532476 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-create-sdzv8" Mar 07 21:42:38.572498 master-0 kubenswrapper[16352]: I0307 21:42:38.572361 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" podStartSLOduration=6.572332558 podStartE2EDuration="6.572332558s" podCreationTimestamp="2026-03-07 21:42:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:38.553881294 +0000 UTC m=+1481.624586383" watchObservedRunningTime="2026-03-07 21:42:38.572332558 +0000 UTC m=+1481.643037627" Mar 07 21:42:38.607307 master-0 kubenswrapper[16352]: I0307 21:42:38.607033 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:40.174726 master-0 kubenswrapper[16352]: I0307 21:42:40.169003 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-86971-api-0" Mar 07 21:42:40.279726 master-0 kubenswrapper[16352]: I0307 21:42:40.277907 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-scheduler-0"] Mar 07 21:42:40.486717 master-0 kubenswrapper[16352]: I0307 21:42:40.486640 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-d904-account-create-update-tc485" Mar 07 21:42:40.638544 master-0 kubenswrapper[16352]: I0307 21:42:40.627297 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-5dbd89f674-7gtrq"] Mar 07 21:42:40.652873 master-0 kubenswrapper[16352]: I0307 21:42:40.652815 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwhf2\" (UniqueName: \"kubernetes.io/projected/77f0edd2-211c-423c-b49f-d2c69df20f23-kube-api-access-xwhf2\") pod \"77f0edd2-211c-423c-b49f-d2c69df20f23\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " Mar 07 21:42:40.653014 master-0 kubenswrapper[16352]: I0307 21:42:40.652980 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77f0edd2-211c-423c-b49f-d2c69df20f23-operator-scripts\") pod \"77f0edd2-211c-423c-b49f-d2c69df20f23\" (UID: \"77f0edd2-211c-423c-b49f-d2c69df20f23\") " Mar 07 21:42:40.654707 master-0 kubenswrapper[16352]: I0307 21:42:40.654645 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77f0edd2-211c-423c-b49f-d2c69df20f23-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "77f0edd2-211c-423c-b49f-d2c69df20f23" (UID: "77f0edd2-211c-423c-b49f-d2c69df20f23"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:40.664241 master-0 kubenswrapper[16352]: I0307 21:42:40.660621 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-volume-lvm-iscsi-0" event={"ID":"3f8fb2d4-ec08-484a-afcc-c09964dc9c8f","Type":"ContainerStarted","Data":"44ecbdeaa43ba26e20aeec32b06a325abc1dead139dda9471b7df0dd5ed10d78"} Mar 07 21:42:40.670322 master-0 kubenswrapper[16352]: I0307 21:42:40.669218 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-6767bc4dd7-cp8fn"] Mar 07 21:42:40.674250 master-0 kubenswrapper[16352]: I0307 21:42:40.671758 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerStarted","Data":"1fdda0405b34946f50e7ae345a45635cd5ff0f1a81d18de3245af175e26b8f2b"} Mar 07 21:42:40.674250 master-0 kubenswrapper[16352]: I0307 21:42:40.673315 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77f0edd2-211c-423c-b49f-d2c69df20f23-kube-api-access-xwhf2" (OuterVolumeSpecName: "kube-api-access-xwhf2") pod "77f0edd2-211c-423c-b49f-d2c69df20f23" (UID: "77f0edd2-211c-423c-b49f-d2c69df20f23"). InnerVolumeSpecName "kube-api-access-xwhf2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:40.674250 master-0 kubenswrapper[16352]: W0307 21:42:40.673892 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3afdaf61_6be7_431c_8256_66e26e3a27a8.slice/crio-8450a85aff72727a382f898baefc128681abca5f5f1f01f13a6e32c684f43273 WatchSource:0}: Error finding container 8450a85aff72727a382f898baefc128681abca5f5f1f01f13a6e32c684f43273: Status 404 returned error can't find the container with id 8450a85aff72727a382f898baefc128681abca5f5f1f01f13a6e32c684f43273 Mar 07 21:42:40.677119 master-0 kubenswrapper[16352]: I0307 21:42:40.675999 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" event={"ID":"55b7e31a-1da5-4528-b904-db7de86e1f26","Type":"ContainerStarted","Data":"bd8b095b9fee43e895c23cc56f76cc4c46f7dbb28b8ddc014fbe2b2781665dbe"} Mar 07 21:42:40.677119 master-0 kubenswrapper[16352]: I0307 21:42:40.676259 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:42:40.684209 master-0 kubenswrapper[16352]: I0307 21:42:40.683971 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-d904-account-create-update-tc485" Mar 07 21:42:40.684209 master-0 kubenswrapper[16352]: I0307 21:42:40.684143 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-d904-account-create-update-tc485" event={"ID":"77f0edd2-211c-423c-b49f-d2c69df20f23","Type":"ContainerDied","Data":"0dde76913aa2a6076df4a254151f3157bae575c49aea0128a79086a5c3ea87c0"} Mar 07 21:42:40.696795 master-0 kubenswrapper[16352]: I0307 21:42:40.684958 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0dde76913aa2a6076df4a254151f3157bae575c49aea0128a79086a5c3ea87c0" Mar 07 21:42:40.696795 master-0 kubenswrapper[16352]: I0307 21:42:40.696670 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"0badc58c-c623-44ad-8a69-7df699628dba","Type":"ContainerStarted","Data":"e65f3aa24df10fbf94be5c3a6c9e8986530e873e60eb8d287273c96644ee153b"} Mar 07 21:42:40.739958 master-0 kubenswrapper[16352]: I0307 21:42:40.733567 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-volume-lvm-iscsi-0" podStartSLOduration=5.733539875 podStartE2EDuration="5.733539875s" podCreationTimestamp="2026-03-07 21:42:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:40.704649772 +0000 UTC m=+1483.775354851" watchObservedRunningTime="2026-03-07 21:42:40.733539875 +0000 UTC m=+1483.804244934" Mar 07 21:42:40.759147 master-0 kubenswrapper[16352]: I0307 21:42:40.759041 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwhf2\" (UniqueName: \"kubernetes.io/projected/77f0edd2-211c-423c-b49f-d2c69df20f23-kube-api-access-xwhf2\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:40.759147 master-0 kubenswrapper[16352]: I0307 21:42:40.759128 16352 reconciler_common.go:293] "Volume 
detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/77f0edd2-211c-423c-b49f-d2c69df20f23-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:40.800295 master-0 kubenswrapper[16352]: I0307 21:42:40.799993 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:40.833985 master-0 kubenswrapper[16352]: I0307 21:42:40.830907 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-conductor-0"] Mar 07 21:42:40.874037 master-0 kubenswrapper[16352]: I0307 21:42:40.873628 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" podStartSLOduration=4.230852063 podStartE2EDuration="8.873590017s" podCreationTimestamp="2026-03-07 21:42:32 +0000 UTC" firstStartedPulling="2026-03-07 21:42:34.855847687 +0000 UTC m=+1477.926552746" lastFinishedPulling="2026-03-07 21:42:39.498585641 +0000 UTC m=+1482.569290700" observedRunningTime="2026-03-07 21:42:40.775152904 +0000 UTC m=+1483.845857963" watchObservedRunningTime="2026-03-07 21:42:40.873590017 +0000 UTC m=+1483.944295076" Mar 07 21:42:40.911750 master-0 kubenswrapper[16352]: I0307 21:42:40.911648 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-86971-backup-0"] Mar 07 21:42:41.031674 master-0 kubenswrapper[16352]: W0307 21:42:41.031599 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40fb80ab_eb5b_4a86_be9c_8d2442f5b9dd.slice/crio-42bca9fe9d3e8ce670949115341c78631daf8347f2e2d4ec0f1935dd5f9f4927 WatchSource:0}: Error finding container 42bca9fe9d3e8ce670949115341c78631daf8347f2e2d4ec0f1935dd5f9f4927: Status 404 returned error can't find the container with id 42bca9fe9d3e8ce670949115341c78631daf8347f2e2d4ec0f1935dd5f9f4927 Mar 07 21:42:41.718164 master-0 kubenswrapper[16352]: I0307 21:42:41.718048 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd","Type":"ContainerStarted","Data":"5cd87142d0426ee37841fc41d7483c6474012a57b79cda6c3942abd97e3b5ab5"} Mar 07 21:42:41.718164 master-0 kubenswrapper[16352]: I0307 21:42:41.718116 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd","Type":"ContainerStarted","Data":"42bca9fe9d3e8ce670949115341c78631daf8347f2e2d4ec0f1935dd5f9f4927"} Mar 07 21:42:41.722659 master-0 kubenswrapper[16352]: I0307 21:42:41.720742 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5dbd89f674-7gtrq" event={"ID":"3afdaf61-6be7-431c-8256-66e26e3a27a8","Type":"ContainerStarted","Data":"379a26bc9446926f10705c3a005683ed9c2245d10d76f4abb005c5e9a8408ffb"} Mar 07 21:42:41.722659 master-0 kubenswrapper[16352]: I0307 21:42:41.720772 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5dbd89f674-7gtrq" event={"ID":"3afdaf61-6be7-431c-8256-66e26e3a27a8","Type":"ContainerStarted","Data":"aa747e240a996f251e5913e744a05247c54558cbe63fb502cd1ff18687309db7"} Mar 07 21:42:41.722659 master-0 kubenswrapper[16352]: I0307 21:42:41.720784 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-5dbd89f674-7gtrq" event={"ID":"3afdaf61-6be7-431c-8256-66e26e3a27a8","Type":"ContainerStarted","Data":"8450a85aff72727a382f898baefc128681abca5f5f1f01f13a6e32c684f43273"} Mar 07 21:42:41.725624 master-0 kubenswrapper[16352]: I0307 21:42:41.724397 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:41.725624 master-0 kubenswrapper[16352]: I0307 21:42:41.724485 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-5dbd89f674-7gtrq" Mar 07 21:42:41.727398 master-0 kubenswrapper[16352]: I0307 21:42:41.727259 
16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6767bc4dd7-cp8fn" event={"ID":"0d0a7bb8-c118-4b85-aaed-0eee4090a321","Type":"ContainerStarted","Data":"348e0616f0be4a9afb84f42272e02dddb3eb24a2d831d90854116d8d5b04d130"} Mar 07 21:42:41.727398 master-0 kubenswrapper[16352]: I0307 21:42:41.727297 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6767bc4dd7-cp8fn" event={"ID":"0d0a7bb8-c118-4b85-aaed-0eee4090a321","Type":"ContainerStarted","Data":"c2101e0260208318987920a6adf3e36e6f148a7a28d5c0af9e403421c44d5484"} Mar 07 21:42:41.734592 master-0 kubenswrapper[16352]: I0307 21:42:41.734458 16352 generic.go:334] "Generic (PLEG): container finished" podID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerID="1fdda0405b34946f50e7ae345a45635cd5ff0f1a81d18de3245af175e26b8f2b" exitCode=0 Mar 07 21:42:41.734592 master-0 kubenswrapper[16352]: I0307 21:42:41.734517 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerDied","Data":"1fdda0405b34946f50e7ae345a45635cd5ff0f1a81d18de3245af175e26b8f2b"} Mar 07 21:42:41.748628 master-0 kubenswrapper[16352]: I0307 21:42:41.745577 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerStarted","Data":"89d6cafb612b210ef0393341371f2cd1303ef6c3dc8e15127ff0674921352df5"} Mar 07 21:42:41.750334 master-0 kubenswrapper[16352]: I0307 21:42:41.750291 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerStarted","Data":"23710c9ed9af0055504306948433bdc5ed58b97b73049adb60e5742cc0d894cd"} Mar 07 21:42:41.769337 master-0 kubenswrapper[16352]: I0307 21:42:41.766942 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" 
event={"ID":"0badc58c-c623-44ad-8a69-7df699628dba","Type":"ContainerStarted","Data":"33aab1a18743878a9138d966bd0ce4441f573d4c39c26027576045f727d05256"} Mar 07 21:42:41.784371 master-0 kubenswrapper[16352]: I0307 21:42:41.784201 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-5dbd89f674-7gtrq" podStartSLOduration=3.784167355 podStartE2EDuration="3.784167355s" podCreationTimestamp="2026-03-07 21:42:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:41.754282017 +0000 UTC m=+1484.824987076" watchObservedRunningTime="2026-03-07 21:42:41.784167355 +0000 UTC m=+1484.854872414" Mar 07 21:42:42.797212 master-0 kubenswrapper[16352]: I0307 21:42:42.796849 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-scheduler-0" event={"ID":"0badc58c-c623-44ad-8a69-7df699628dba","Type":"ContainerStarted","Data":"8f0162136c49eeb335524d24c34fcf55e15654727839f8f89ffe1428d191ec46"} Mar 07 21:42:42.804641 master-0 kubenswrapper[16352]: I0307 21:42:42.804585 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-86971-backup-0" event={"ID":"40fb80ab-eb5b-4a86-be9c-8d2442f5b9dd","Type":"ContainerStarted","Data":"22ea285489825ad563745bc1e67f02536f526969115f6faf8d805806dc0d0323"} Mar 07 21:42:42.821271 master-0 kubenswrapper[16352]: I0307 21:42:42.818172 16352 generic.go:334] "Generic (PLEG): container finished" podID="0d0a7bb8-c118-4b85-aaed-0eee4090a321" containerID="348e0616f0be4a9afb84f42272e02dddb3eb24a2d831d90854116d8d5b04d130" exitCode=0 Mar 07 21:42:42.821271 master-0 kubenswrapper[16352]: I0307 21:42:42.818264 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6767bc4dd7-cp8fn" event={"ID":"0d0a7bb8-c118-4b85-aaed-0eee4090a321","Type":"ContainerDied","Data":"348e0616f0be4a9afb84f42272e02dddb3eb24a2d831d90854116d8d5b04d130"} Mar 07 21:42:42.821271 
master-0 kubenswrapper[16352]: I0307 21:42:42.818299 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6767bc4dd7-cp8fn" event={"ID":"0d0a7bb8-c118-4b85-aaed-0eee4090a321","Type":"ContainerStarted","Data":"f4e75ec0eb4fcbf2d88c88ecdbf315c3ce28c7a31820edb5941ce1073fd55a74"} Mar 07 21:42:42.821271 master-0 kubenswrapper[16352]: I0307 21:42:42.818311 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-6767bc4dd7-cp8fn" event={"ID":"0d0a7bb8-c118-4b85-aaed-0eee4090a321","Type":"ContainerStarted","Data":"3ca3b6816d33f336256627d246177b1a67debd8cf62d0dab7672c5f184413832"} Mar 07 21:42:42.821271 master-0 kubenswrapper[16352]: I0307 21:42:42.819577 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-6767bc4dd7-cp8fn" Mar 07 21:42:42.822147 master-0 kubenswrapper[16352]: I0307 21:42:42.822063 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-scheduler-0" podStartSLOduration=6.822042227 podStartE2EDuration="6.822042227s" podCreationTimestamp="2026-03-07 21:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:42.820124821 +0000 UTC m=+1485.890829880" watchObservedRunningTime="2026-03-07 21:42:42.822042227 +0000 UTC m=+1485.892747286" Mar 07 21:42:42.828905 master-0 kubenswrapper[16352]: I0307 21:42:42.828825 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerStarted","Data":"1beb61f08097f64d7a19ac3a3ec09b1db4f26293ecd06b468da2a35217c577bf"} Mar 07 21:42:42.855511 master-0 kubenswrapper[16352]: I0307 21:42:42.850772 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-6767bc4dd7-cp8fn" podStartSLOduration=5.850746646 podStartE2EDuration="5.850746646s" 
podCreationTimestamp="2026-03-07 21:42:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:42.843751988 +0000 UTC m=+1485.914457057" watchObservedRunningTime="2026-03-07 21:42:42.850746646 +0000 UTC m=+1485.921451705" Mar 07 21:42:42.934642 master-0 kubenswrapper[16352]: I0307 21:42:42.934509 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-86971-backup-0" podStartSLOduration=6.934473756 podStartE2EDuration="6.934473756s" podCreationTimestamp="2026-03-07 21:42:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:42.882366675 +0000 UTC m=+1485.953071744" watchObservedRunningTime="2026-03-07 21:42:42.934473756 +0000 UTC m=+1486.005178815" Mar 07 21:42:43.484825 master-0 kubenswrapper[16352]: I0307 21:42:43.484612 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" Mar 07 21:42:43.613090 master-0 kubenswrapper[16352]: I0307 21:42:43.612805 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bbc6577f5-mldsh"] Mar 07 21:42:43.614607 master-0 kubenswrapper[16352]: I0307 21:42:43.613136 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerName="dnsmasq-dns" containerID="cri-o://e74f55a7aaa366ef6f2d2160a993e43cd45ad2b9ce7ad66324b3aa5c5125bc27" gracePeriod=10 Mar 07 21:42:43.669646 master-0 kubenswrapper[16352]: I0307 21:42:43.669393 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:42:43.806932 master-0 kubenswrapper[16352]: I0307 21:42:43.801196 16352 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.209:5353: connect: connection refused" Mar 07 21:42:43.851510 master-0 kubenswrapper[16352]: I0307 21:42:43.844822 16352 generic.go:334] "Generic (PLEG): container finished" podID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerID="e74f55a7aaa366ef6f2d2160a993e43cd45ad2b9ce7ad66324b3aa5c5125bc27" exitCode=0 Mar 07 21:42:43.851510 master-0 kubenswrapper[16352]: I0307 21:42:43.844887 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" event={"ID":"6a84aa45-9fea-4aaa-8e68-500d08c4f625","Type":"ContainerDied","Data":"e74f55a7aaa366ef6f2d2160a993e43cd45ad2b9ce7ad66324b3aa5c5125bc27"} Mar 07 21:42:43.851510 master-0 kubenswrapper[16352]: I0307 21:42:43.849519 16352 generic.go:334] "Generic (PLEG): container finished" podID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerID="dee688209816ed52fc8614e20b0b502e502c0034b89d782cebfc9f558feca489" exitCode=1 Mar 07 21:42:43.851510 master-0 kubenswrapper[16352]: I0307 21:42:43.851352 16352 scope.go:117] "RemoveContainer" containerID="dee688209816ed52fc8614e20b0b502e502c0034b89d782cebfc9f558feca489" Mar 07 21:42:43.851815 master-0 kubenswrapper[16352]: I0307 21:42:43.851778 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerDied","Data":"dee688209816ed52fc8614e20b0b502e502c0034b89d782cebfc9f558feca489"} Mar 07 21:42:44.407306 master-0 kubenswrapper[16352]: I0307 21:42:44.407239 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:42:44.546111 master-0 kubenswrapper[16352]: I0307 21:42:44.543853 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-sb\") pod \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " Mar 07 21:42:44.546111 master-0 kubenswrapper[16352]: I0307 21:42:44.544065 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-config\") pod \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " Mar 07 21:42:44.546111 master-0 kubenswrapper[16352]: I0307 21:42:44.544240 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fgm2d\" (UniqueName: \"kubernetes.io/projected/6a84aa45-9fea-4aaa-8e68-500d08c4f625-kube-api-access-fgm2d\") pod \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " Mar 07 21:42:44.546111 master-0 kubenswrapper[16352]: I0307 21:42:44.544569 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-swift-storage-0\") pod \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " Mar 07 21:42:44.546111 master-0 kubenswrapper[16352]: I0307 21:42:44.544590 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-svc\") pod \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " Mar 07 21:42:44.546111 master-0 kubenswrapper[16352]: I0307 21:42:44.544665 16352 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-nb\") pod \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\" (UID: \"6a84aa45-9fea-4aaa-8e68-500d08c4f625\") " Mar 07 21:42:44.548726 master-0 kubenswrapper[16352]: I0307 21:42:44.547857 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a84aa45-9fea-4aaa-8e68-500d08c4f625-kube-api-access-fgm2d" (OuterVolumeSpecName: "kube-api-access-fgm2d") pod "6a84aa45-9fea-4aaa-8e68-500d08c4f625" (UID: "6a84aa45-9fea-4aaa-8e68-500d08c4f625"). InnerVolumeSpecName "kube-api-access-fgm2d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:44.615634 master-0 kubenswrapper[16352]: I0307 21:42:44.615554 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "6a84aa45-9fea-4aaa-8e68-500d08c4f625" (UID: "6a84aa45-9fea-4aaa-8e68-500d08c4f625"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:44.622116 master-0 kubenswrapper[16352]: I0307 21:42:44.622007 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "6a84aa45-9fea-4aaa-8e68-500d08c4f625" (UID: "6a84aa45-9fea-4aaa-8e68-500d08c4f625"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:44.633191 master-0 kubenswrapper[16352]: I0307 21:42:44.633124 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "6a84aa45-9fea-4aaa-8e68-500d08c4f625" (UID: "6a84aa45-9fea-4aaa-8e68-500d08c4f625"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:44.646194 master-0 kubenswrapper[16352]: I0307 21:42:44.646107 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-config" (OuterVolumeSpecName: "config") pod "6a84aa45-9fea-4aaa-8e68-500d08c4f625" (UID: "6a84aa45-9fea-4aaa-8e68-500d08c4f625"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:44.649808 master-0 kubenswrapper[16352]: I0307 21:42:44.649731 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "6a84aa45-9fea-4aaa-8e68-500d08c4f625" (UID: "6a84aa45-9fea-4aaa-8e68-500d08c4f625"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:42:44.650736 master-0 kubenswrapper[16352]: I0307 21:42:44.650665 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:44.650799 master-0 kubenswrapper[16352]: I0307 21:42:44.650736 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:44.650799 master-0 kubenswrapper[16352]: I0307 21:42:44.650757 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fgm2d\" (UniqueName: \"kubernetes.io/projected/6a84aa45-9fea-4aaa-8e68-500d08c4f625-kube-api-access-fgm2d\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:44.650799 master-0 kubenswrapper[16352]: I0307 21:42:44.650776 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:44.650799 master-0 kubenswrapper[16352]: I0307 21:42:44.650789 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-dns-svc\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:44.650799 master-0 kubenswrapper[16352]: I0307 21:42:44.650801 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/6a84aa45-9fea-4aaa-8e68-500d08c4f625-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:44.871941 master-0 kubenswrapper[16352]: I0307 21:42:44.871885 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" Mar 07 21:42:44.872412 master-0 kubenswrapper[16352]: I0307 21:42:44.872332 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7bbc6577f5-mldsh" event={"ID":"6a84aa45-9fea-4aaa-8e68-500d08c4f625","Type":"ContainerDied","Data":"893676acaf7cdf2a2f3b3c8d64661fa8e2c100e5b3b776e2a53ce289aab9b21e"} Mar 07 21:42:44.872473 master-0 kubenswrapper[16352]: I0307 21:42:44.872450 16352 scope.go:117] "RemoveContainer" containerID="e74f55a7aaa366ef6f2d2160a993e43cd45ad2b9ce7ad66324b3aa5c5125bc27" Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.882057 16352 generic.go:334] "Generic (PLEG): container finished" podID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerID="61883eb182d25c44bccf9c3c8a5b5f0dbdc58c65c0df3fd3fbf5d6fde28e454e" exitCode=1 Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.882168 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerDied","Data":"61883eb182d25c44bccf9c3c8a5b5f0dbdc58c65c0df3fd3fbf5d6fde28e454e"} Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.882881 16352 scope.go:117] "RemoveContainer" containerID="61883eb182d25c44bccf9c3c8a5b5f0dbdc58c65c0df3fd3fbf5d6fde28e454e" Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: E0307 21:42:44.883225 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-f97759bbc-nbv8w_openstack(9e3ae5f4-4a11-4c09-9831-effc4a588f9b)\"" pod="openstack/ironic-f97759bbc-nbv8w" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.885640 16352 generic.go:334] "Generic (PLEG): container finished" podID="121505c3-5091-4945-a0aa-ec97b5f45ce5" 
containerID="89d6cafb612b210ef0393341371f2cd1303ef6c3dc8e15127ff0674921352df5" exitCode=0 Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.885753 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerDied","Data":"89d6cafb612b210ef0393341371f2cd1303ef6c3dc8e15127ff0674921352df5"} Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.891463 16352 generic.go:334] "Generic (PLEG): container finished" podID="55b7e31a-1da5-4528-b904-db7de86e1f26" containerID="bd8b095b9fee43e895c23cc56f76cc4c46f7dbb28b8ddc014fbe2b2781665dbe" exitCode=1 Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.892774 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" event={"ID":"55b7e31a-1da5-4528-b904-db7de86e1f26","Type":"ContainerDied","Data":"bd8b095b9fee43e895c23cc56f76cc4c46f7dbb28b8ddc014fbe2b2781665dbe"} Mar 07 21:42:44.904185 master-0 kubenswrapper[16352]: I0307 21:42:44.893753 16352 scope.go:117] "RemoveContainer" containerID="bd8b095b9fee43e895c23cc56f76cc4c46f7dbb28b8ddc014fbe2b2781665dbe" Mar 07 21:42:44.931166 master-0 kubenswrapper[16352]: I0307 21:42:44.931065 16352 scope.go:117] "RemoveContainer" containerID="cd39d2635e34ca1b607a0d7f795b3d69ae5bebd4d892a8aa9dd45224acee1f66" Mar 07 21:42:44.980844 master-0 kubenswrapper[16352]: I0307 21:42:44.980734 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7bbc6577f5-mldsh"] Mar 07 21:42:45.026945 master-0 kubenswrapper[16352]: I0307 21:42:45.003495 16352 scope.go:117] "RemoveContainer" containerID="dee688209816ed52fc8614e20b0b502e502c0034b89d782cebfc9f558feca489" Mar 07 21:42:45.026945 master-0 kubenswrapper[16352]: I0307 21:42:45.011483 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7bbc6577f5-mldsh"] Mar 07 21:42:45.207647 master-0 kubenswrapper[16352]: 
I0307 21:42:45.207603 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" path="/var/lib/kubelet/pods/6a84aa45-9fea-4aaa-8e68-500d08c4f625/volumes" Mar 07 21:42:45.942635 master-0 kubenswrapper[16352]: I0307 21:42:45.942571 16352 scope.go:117] "RemoveContainer" containerID="61883eb182d25c44bccf9c3c8a5b5f0dbdc58c65c0df3fd3fbf5d6fde28e454e" Mar 07 21:42:45.950878 master-0 kubenswrapper[16352]: E0307 21:42:45.948073 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-f97759bbc-nbv8w_openstack(9e3ae5f4-4a11-4c09-9831-effc4a588f9b)\"" pod="openstack/ironic-f97759bbc-nbv8w" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" Mar 07 21:42:45.953446 master-0 kubenswrapper[16352]: I0307 21:42:45.953391 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" event={"ID":"55b7e31a-1da5-4528-b904-db7de86e1f26","Type":"ContainerStarted","Data":"7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031"} Mar 07 21:42:45.953778 master-0 kubenswrapper[16352]: I0307 21:42:45.953743 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:42:46.036419 master-0 kubenswrapper[16352]: I0307 21:42:46.035566 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-86971-volume-lvm-iscsi-0" Mar 07 21:42:46.899567 master-0 kubenswrapper[16352]: I0307 21:42:46.899386 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:47.206449 master-0 kubenswrapper[16352]: I0307 21:42:47.206183 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-86971-scheduler-0" Mar 07 21:42:47.580958 master-0 
kubenswrapper[16352]: I0307 21:42:47.557843 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-86971-backup-0" Mar 07 21:42:47.794818 master-0 kubenswrapper[16352]: I0307 21:42:47.794595 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-86971-backup-0" Mar 07 21:42:47.834601 master-0 kubenswrapper[16352]: I0307 21:42:47.833937 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-798d5f97fb-2sbnv" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: I0307 21:42:48.289839 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-db-sync-hst88"] Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: E0307 21:42:48.290610 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77f0edd2-211c-423c-b49f-d2c69df20f23" containerName="mariadb-account-create-update" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: I0307 21:42:48.290626 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="77f0edd2-211c-423c-b49f-d2c69df20f23" containerName="mariadb-account-create-update" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: E0307 21:42:48.290680 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerName="init" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: I0307 21:42:48.290703 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerName="init" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: E0307 21:42:48.290722 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerName="dnsmasq-dns" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: I0307 21:42:48.290731 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerName="dnsmasq-dns" Mar 07 
21:42:48.293419 master-0 kubenswrapper[16352]: I0307 21:42:48.291034 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a84aa45-9fea-4aaa-8e68-500d08c4f625" containerName="dnsmasq-dns" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: I0307 21:42:48.291058 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="77f0edd2-211c-423c-b49f-d2c69df20f23" containerName="mariadb-account-create-update" Mar 07 21:42:48.293419 master-0 kubenswrapper[16352]: I0307 21:42:48.292085 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.304194 master-0 kubenswrapper[16352]: I0307 21:42:48.304101 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-hst88"] Mar 07 21:42:48.309706 master-0 kubenswrapper[16352]: I0307 21:42:48.307738 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 07 21:42:48.309706 master-0 kubenswrapper[16352]: I0307 21:42:48.308023 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 07 21:42:48.371878 master-0 kubenswrapper[16352]: I0307 21:42:48.368913 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-scripts\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.385023 master-0 kubenswrapper[16352]: I0307 21:42:48.384935 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-hst88\" (UID: 
\"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.399161 master-0 kubenswrapper[16352]: I0307 21:42:48.385611 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-combined-ca-bundle\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.399161 master-0 kubenswrapper[16352]: I0307 21:42:48.385962 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rw4lv\" (UniqueName: \"kubernetes.io/projected/4c65c147-410b-4022-9104-20fb7c362674-kube-api-access-rw4lv\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.399161 master-0 kubenswrapper[16352]: I0307 21:42:48.386509 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-config\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.399161 master-0 kubenswrapper[16352]: I0307 21:42:48.386667 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.399161 master-0 kubenswrapper[16352]: I0307 21:42:48.386826 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" 
(UniqueName: \"kubernetes.io/downward-api/4c65c147-410b-4022-9104-20fb7c362674-etc-podinfo\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.490253 master-0 kubenswrapper[16352]: I0307 21:42:48.490174 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rw4lv\" (UniqueName: \"kubernetes.io/projected/4c65c147-410b-4022-9104-20fb7c362674-kube-api-access-rw4lv\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.490622 master-0 kubenswrapper[16352]: I0307 21:42:48.490319 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-config\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.491094 master-0 kubenswrapper[16352]: I0307 21:42:48.490368 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.491171 master-0 kubenswrapper[16352]: I0307 21:42:48.491140 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c65c147-410b-4022-9104-20fb7c362674-etc-podinfo\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.491304 master-0 kubenswrapper[16352]: I0307 21:42:48.491230 16352 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-scripts\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.491304 master-0 kubenswrapper[16352]: I0307 21:42:48.491292 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.491403 master-0 kubenswrapper[16352]: I0307 21:42:48.491366 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-combined-ca-bundle\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.491542 master-0 kubenswrapper[16352]: I0307 21:42:48.491465 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.492024 master-0 kubenswrapper[16352]: I0307 21:42:48.491986 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.494439 master-0 
kubenswrapper[16352]: I0307 21:42:48.494414 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-config\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.514306 master-0 kubenswrapper[16352]: I0307 21:42:48.514176 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-combined-ca-bundle\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.515677 master-0 kubenswrapper[16352]: E0307 21:42:48.514645 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031" cmd=["/bin/true"] Mar 07 21:42:48.515677 master-0 kubenswrapper[16352]: I0307 21:42:48.514490 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-scripts\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:48.515677 master-0 kubenswrapper[16352]: E0307 21:42:48.514870 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031" 
cmd=["/bin/true"] Mar 07 21:42:48.515677 master-0 kubenswrapper[16352]: E0307 21:42:48.515134 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031" cmd=["/bin/true"] Mar 07 21:42:48.515677 master-0 kubenswrapper[16352]: E0307 21:42:48.515294 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031" cmd=["/bin/true"] Mar 07 21:42:48.516928 master-0 kubenswrapper[16352]: E0307 21:42:48.516870 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031" cmd=["/bin/true"] Mar 07 21:42:48.516928 master-0 kubenswrapper[16352]: E0307 21:42:48.516917 16352 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" probeType="Readiness" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" podUID="55b7e31a-1da5-4528-b904-db7de86e1f26" containerName="ironic-neutron-agent" Mar 07 21:42:48.519281 master-0 kubenswrapper[16352]: I0307 21:42:48.519210 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rw4lv\" 
(UniqueName: \"kubernetes.io/projected/4c65c147-410b-4022-9104-20fb7c362674-kube-api-access-rw4lv\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88"
Mar 07 21:42:48.520842 master-0 kubenswrapper[16352]: E0307 21:42:48.520680 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031" cmd=["/bin/true"]
Mar 07 21:42:48.520922 master-0 kubenswrapper[16352]: E0307 21:42:48.520869 16352 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031 is running failed: container process not found" probeType="Liveness" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" podUID="55b7e31a-1da5-4528-b904-db7de86e1f26" containerName="ironic-neutron-agent"
Mar 07 21:42:48.541447 master-0 kubenswrapper[16352]: I0307 21:42:48.541372 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c65c147-410b-4022-9104-20fb7c362674-etc-podinfo\") pod \"ironic-inspector-db-sync-hst88\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " pod="openstack/ironic-inspector-db-sync-hst88"
Mar 07 21:42:48.663451 master-0 kubenswrapper[16352]: I0307 21:42:48.663296 16352 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:48.663451 master-0 kubenswrapper[16352]: I0307 21:42:48.663371 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:48.664595 master-0 kubenswrapper[16352]: I0307 21:42:48.664543 16352 scope.go:117] "RemoveContainer" containerID="61883eb182d25c44bccf9c3c8a5b5f0dbdc58c65c0df3fd3fbf5d6fde28e454e"
Mar 07 21:42:48.665053 master-0 kubenswrapper[16352]: E0307 21:42:48.664830 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-api\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-api pod=ironic-f97759bbc-nbv8w_openstack(9e3ae5f4-4a11-4c09-9831-effc4a588f9b)\"" pod="openstack/ironic-f97759bbc-nbv8w" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b"
Mar 07 21:42:48.703481 master-0 kubenswrapper[16352]: I0307 21:42:48.703007 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-hst88"
Mar 07 21:42:49.012984 master-0 kubenswrapper[16352]: I0307 21:42:49.012528 16352 generic.go:334] "Generic (PLEG): container finished" podID="55b7e31a-1da5-4528-b904-db7de86e1f26" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031" exitCode=1
Mar 07 21:42:49.012984 master-0 kubenswrapper[16352]: I0307 21:42:49.012597 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" event={"ID":"55b7e31a-1da5-4528-b904-db7de86e1f26","Type":"ContainerDied","Data":"7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031"}
Mar 07 21:42:49.012984 master-0 kubenswrapper[16352]: I0307 21:42:49.012669 16352 scope.go:117] "RemoveContainer" containerID="bd8b095b9fee43e895c23cc56f76cc4c46f7dbb28b8ddc014fbe2b2781665dbe"
Mar 07 21:42:49.014643 master-0 kubenswrapper[16352]: I0307 21:42:49.014173 16352 scope.go:117] "RemoveContainer" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031"
Mar 07 21:42:49.014809 master-0 kubenswrapper[16352]: E0307 21:42:49.014670 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-89874fdc8-kjtzj_openstack(55b7e31a-1da5-4528-b904-db7de86e1f26)\"" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" podUID="55b7e31a-1da5-4528-b904-db7de86e1f26"
Mar 07 21:42:49.390064 master-0 kubenswrapper[16352]: I0307 21:42:49.389958 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-db-sync-hst88"]
Mar 07 21:42:49.763436 master-0 kubenswrapper[16352]: I0307 21:42:49.763240 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-6767bc4dd7-cp8fn"
Mar 07 21:42:49.889717 master-0 kubenswrapper[16352]: I0307 21:42:49.888982 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-f97759bbc-nbv8w"]
Mar 07 21:42:49.889717 master-0 kubenswrapper[16352]: I0307 21:42:49.889392 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ironic-f97759bbc-nbv8w" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api-log" containerID="cri-o://1beb61f08097f64d7a19ac3a3ec09b1db4f26293ecd06b468da2a35217c577bf" gracePeriod=60
Mar 07 21:42:50.051104 master-0 kubenswrapper[16352]: I0307 21:42:50.051041 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-hst88" event={"ID":"4c65c147-410b-4022-9104-20fb7c362674","Type":"ContainerStarted","Data":"cea30d315ccb2026bd438c88ddad30d9033c0bb786a564cba9cb8f4ced800c5d"}
Mar 07 21:42:50.056096 master-0 kubenswrapper[16352]: I0307 21:42:50.055961 16352 generic.go:334] "Generic (PLEG): container finished" podID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerID="1beb61f08097f64d7a19ac3a3ec09b1db4f26293ecd06b468da2a35217c577bf" exitCode=143
Mar 07 21:42:50.056229 master-0 kubenswrapper[16352]: I0307 21:42:50.056155 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerDied","Data":"1beb61f08097f64d7a19ac3a3ec09b1db4f26293ecd06b468da2a35217c577bf"}
Mar 07 21:42:50.521566 master-0 kubenswrapper[16352]: I0307 21:42:50.521500 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699432 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-custom\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699568 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-etc-podinfo\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699676 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-merged\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699788 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-scripts\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699822 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-combined-ca-bundle\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699885 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699919 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m8ffg\" (UniqueName: \"kubernetes.io/projected/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-kube-api-access-m8ffg\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.699990 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-logs\") pod \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\" (UID: \"9e3ae5f4-4a11-4c09-9831-effc4a588f9b\") "
Mar 07 21:42:50.702473 master-0 kubenswrapper[16352]: I0307 21:42:50.701198 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-logs" (OuterVolumeSpecName: "logs") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:42:50.704564 master-0 kubenswrapper[16352]: I0307 21:42:50.704482 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-merged" (OuterVolumeSpecName: "config-data-merged") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "config-data-merged". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:42:50.705336 master-0 kubenswrapper[16352]: I0307 21:42:50.705293 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:50.706119 master-0 kubenswrapper[16352]: I0307 21:42:50.705860 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue ""
Mar 07 21:42:50.707031 master-0 kubenswrapper[16352]: I0307 21:42:50.706997 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-scripts" (OuterVolumeSpecName: "scripts") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:50.707700 master-0 kubenswrapper[16352]: I0307 21:42:50.707643 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-kube-api-access-m8ffg" (OuterVolumeSpecName: "kube-api-access-m8ffg") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "kube-api-access-m8ffg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:42:50.738855 master-0 kubenswrapper[16352]: I0307 21:42:50.738778 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data" (OuterVolumeSpecName: "config-data") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:50.803560 master-0 kubenswrapper[16352]: I0307 21:42:50.803502 16352 reconciler_common.go:293] "Volume detached for volume \"config-data-merged\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-merged\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:50.803560 master-0 kubenswrapper[16352]: I0307 21:42:50.803560 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:50.803724 master-0 kubenswrapper[16352]: I0307 21:42:50.803571 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:50.803724 master-0 kubenswrapper[16352]: I0307 21:42:50.803583 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m8ffg\" (UniqueName: \"kubernetes.io/projected/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-kube-api-access-m8ffg\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:50.803724 master-0 kubenswrapper[16352]: I0307 21:42:50.803593 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-logs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:50.803724 master-0 kubenswrapper[16352]: I0307 21:42:50.803627 16352 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-config-data-custom\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:50.803724 master-0 kubenswrapper[16352]: I0307 21:42:50.803636 16352 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-etc-podinfo\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:50.821059 master-0 kubenswrapper[16352]: I0307 21:42:50.821003 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "9e3ae5f4-4a11-4c09-9831-effc4a588f9b" (UID: "9e3ae5f4-4a11-4c09-9831-effc4a588f9b"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:42:50.906998 master-0 kubenswrapper[16352]: I0307 21:42:50.906519 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/9e3ae5f4-4a11-4c09-9831-effc4a588f9b-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:42:51.079571 master-0 kubenswrapper[16352]: I0307 21:42:51.079400 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-f97759bbc-nbv8w" event={"ID":"9e3ae5f4-4a11-4c09-9831-effc4a588f9b","Type":"ContainerDied","Data":"05f0ec7f7abba55d66b2d3c188ce5adf02d933efe187795c57a799255b5b2432"}
Mar 07 21:42:51.079571 master-0 kubenswrapper[16352]: I0307 21:42:51.079507 16352 scope.go:117] "RemoveContainer" containerID="61883eb182d25c44bccf9c3c8a5b5f0dbdc58c65c0df3fd3fbf5d6fde28e454e"
Mar 07 21:42:51.079976 master-0 kubenswrapper[16352]: I0307 21:42:51.079538 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-f97759bbc-nbv8w"
Mar 07 21:42:51.132949 master-0 kubenswrapper[16352]: I0307 21:42:51.132769 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"]
Mar 07 21:42:51.133670 master-0 kubenswrapper[16352]: E0307 21:42:51.133629 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api"
Mar 07 21:42:51.133670 master-0 kubenswrapper[16352]: I0307 21:42:51.133662 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api"
Mar 07 21:42:51.133670 master-0 kubenswrapper[16352]: E0307 21:42:51.133705 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api"
Mar 07 21:42:51.133670 master-0 kubenswrapper[16352]: I0307 21:42:51.133717 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api"
Mar 07 21:42:51.134129 master-0 kubenswrapper[16352]: E0307 21:42:51.133745 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api-log"
Mar 07 21:42:51.134129 master-0 kubenswrapper[16352]: I0307 21:42:51.133757 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api-log"
Mar 07 21:42:51.134129 master-0 kubenswrapper[16352]: E0307 21:42:51.133789 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="init"
Mar 07 21:42:51.134129 master-0 kubenswrapper[16352]: I0307 21:42:51.133799 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="init"
Mar 07 21:42:51.134612 master-0 kubenswrapper[16352]: I0307 21:42:51.134140 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api"
Mar 07 21:42:51.134612 master-0 kubenswrapper[16352]: I0307 21:42:51.134172 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api-log"
Mar 07 21:42:51.135515 master-0 kubenswrapper[16352]: I0307 21:42:51.135486 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 07 21:42:51.139706 master-0 kubenswrapper[16352]: I0307 21:42:51.138873 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret"
Mar 07 21:42:51.139831 master-0 kubenswrapper[16352]: I0307 21:42:51.139792 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config"
Mar 07 21:42:51.154181 master-0 kubenswrapper[16352]: I0307 21:42:51.153881 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 07 21:42:51.184837 master-0 kubenswrapper[16352]: I0307 21:42:51.184730 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-f97759bbc-nbv8w"]
Mar 07 21:42:51.217952 master-0 kubenswrapper[16352]: I0307 21:42:51.217456 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-f97759bbc-nbv8w"]
Mar 07 21:42:51.324661 master-0 kubenswrapper[16352]: I0307 21:42:51.322666 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-openstack-config\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.324661 master-0 kubenswrapper[16352]: I0307 21:42:51.324118 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfdb2\" (UniqueName: \"kubernetes.io/projected/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-kube-api-access-kfdb2\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.324661 master-0 kubenswrapper[16352]: I0307 21:42:51.324221 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-combined-ca-bundle\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.324661 master-0 kubenswrapper[16352]: I0307 21:42:51.324455 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-openstack-config-secret\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.428885 master-0 kubenswrapper[16352]: I0307 21:42:51.428798 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-combined-ca-bundle\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.429211 master-0 kubenswrapper[16352]: I0307 21:42:51.428949 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-openstack-config-secret\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.429211 master-0 kubenswrapper[16352]: I0307 21:42:51.429098 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-openstack-config\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.429555 master-0 kubenswrapper[16352]: I0307 21:42:51.429232 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfdb2\" (UniqueName: \"kubernetes.io/projected/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-kube-api-access-kfdb2\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.431064 master-0 kubenswrapper[16352]: I0307 21:42:51.431014 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-openstack-config\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.435208 master-0 kubenswrapper[16352]: I0307 21:42:51.435132 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-combined-ca-bundle\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.435541 master-0 kubenswrapper[16352]: I0307 21:42:51.435492 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-openstack-config-secret\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.452583 master-0 kubenswrapper[16352]: I0307 21:42:51.452506 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfdb2\" (UniqueName: \"kubernetes.io/projected/05fd5d5c-a1d5-49d5-bd52-189f40a2dc43-kube-api-access-kfdb2\") pod \"openstackclient\" (UID: \"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43\") " pod="openstack/openstackclient"
Mar 07 21:42:51.472911 master-0 kubenswrapper[16352]: I0307 21:42:51.472839 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient"
Mar 07 21:42:51.806045 master-0 kubenswrapper[16352]: I0307 21:42:51.805803 16352 scope.go:117] "RemoveContainer" containerID="1beb61f08097f64d7a19ac3a3ec09b1db4f26293ecd06b468da2a35217c577bf"
Mar 07 21:42:51.879247 master-0 kubenswrapper[16352]: I0307 21:42:51.878994 16352 scope.go:117] "RemoveContainer" containerID="1fdda0405b34946f50e7ae345a45635cd5ff0f1a81d18de3245af175e26b8f2b"
Mar 07 21:42:52.396010 master-0 kubenswrapper[16352]: I0307 21:42:52.391269 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"]
Mar 07 21:42:53.119793 master-0 kubenswrapper[16352]: I0307 21:42:53.118878 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-hst88" event={"ID":"4c65c147-410b-4022-9104-20fb7c362674","Type":"ContainerStarted","Data":"1544aaf023ac14ad99649dc624366028793977da9454b79d8e5eba2a3a57847e"}
Mar 07 21:42:53.122857 master-0 kubenswrapper[16352]: I0307 21:42:53.122778 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43","Type":"ContainerStarted","Data":"01393ac0a42a9696fbe6347adfa13571a542ea9021233ac7e64ae8409aafdf3c"}
Mar 07 21:42:53.160726 master-0 kubenswrapper[16352]: I0307 21:42:53.154473 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-db-sync-hst88" podStartSLOduration=2.651461952 podStartE2EDuration="5.154448059s" podCreationTimestamp="2026-03-07 21:42:48 +0000 UTC" firstStartedPulling="2026-03-07 21:42:49.40455032 +0000 UTC m=+1492.475255369" lastFinishedPulling="2026-03-07 21:42:51.907536407 +0000 UTC m=+1494.978241476" observedRunningTime="2026-03-07 21:42:53.142651416 +0000 UTC m=+1496.213356495" watchObservedRunningTime="2026-03-07 21:42:53.154448059 +0000 UTC m=+1496.225153108"
Mar 07 21:42:53.214062 master-0 kubenswrapper[16352]: I0307 21:42:53.213969 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" path="/var/lib/kubelet/pods/9e3ae5f4-4a11-4c09-9831-effc4a588f9b/volumes"
Mar 07 21:42:53.509972 master-0 kubenswrapper[16352]: I0307 21:42:53.509784 16352 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj"
Mar 07 21:42:53.511356 master-0 kubenswrapper[16352]: I0307 21:42:53.511319 16352 scope.go:117] "RemoveContainer" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031"
Mar 07 21:42:53.511727 master-0 kubenswrapper[16352]: E0307 21:42:53.511676 16352 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ironic-neutron-agent\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ironic-neutron-agent pod=ironic-neutron-agent-89874fdc8-kjtzj_openstack(55b7e31a-1da5-4528-b904-db7de86e1f26)\"" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" podUID="55b7e31a-1da5-4528-b904-db7de86e1f26"
Mar 07 21:42:54.630630 master-0 kubenswrapper[16352]: I0307 21:42:54.630553 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:42:55.194727 master-0 kubenswrapper[16352]: I0307 21:42:55.194627 16352 generic.go:334] "Generic (PLEG): container finished" podID="4c65c147-410b-4022-9104-20fb7c362674" containerID="1544aaf023ac14ad99649dc624366028793977da9454b79d8e5eba2a3a57847e" exitCode=0
Mar 07 21:42:55.240260 master-0 kubenswrapper[16352]: I0307 21:42:55.239348 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-hst88" event={"ID":"4c65c147-410b-4022-9104-20fb7c362674","Type":"ContainerDied","Data":"1544aaf023ac14ad99649dc624366028793977da9454b79d8e5eba2a3a57847e"}
Mar 07 21:42:55.240260 master-0 kubenswrapper[16352]: I0307 21:42:55.239413 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-7b675b8b94-rfvgr"]
Mar 07 21:42:55.240260 master-0 kubenswrapper[16352]: I0307 21:42:55.240128 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e3ae5f4-4a11-4c09-9831-effc4a588f9b" containerName="ironic-api"
Mar 07 21:42:55.241887 master-0 kubenswrapper[16352]: I0307 21:42:55.241512 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.244961 master-0 kubenswrapper[16352]: I0307 21:42:55.244532 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc"
Mar 07 21:42:55.244961 master-0 kubenswrapper[16352]: I0307 21:42:55.244819 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b675b8b94-rfvgr"]
Mar 07 21:42:55.247652 master-0 kubenswrapper[16352]: I0307 21:42:55.247605 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc"
Mar 07 21:42:55.248890 master-0 kubenswrapper[16352]: I0307 21:42:55.247726 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259183 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a43b13cc-0d91-4625-91cb-003ccf32af78-log-httpd\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259279 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-config-data\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259353 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a43b13cc-0d91-4625-91cb-003ccf32af78-etc-swift\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259401 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2rn8\" (UniqueName: \"kubernetes.io/projected/a43b13cc-0d91-4625-91cb-003ccf32af78-kube-api-access-h2rn8\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259453 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-public-tls-certs\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259473 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-internal-tls-certs\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259518 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a43b13cc-0d91-4625-91cb-003ccf32af78-run-httpd\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.259874 master-0 kubenswrapper[16352]: I0307 21:42:55.259578 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-combined-ca-bundle\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.362862 master-0 kubenswrapper[16352]: I0307 21:42:55.362763 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a43b13cc-0d91-4625-91cb-003ccf32af78-etc-swift\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.363268 master-0 kubenswrapper[16352]: I0307 21:42:55.363235 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2rn8\" (UniqueName: \"kubernetes.io/projected/a43b13cc-0d91-4625-91cb-003ccf32af78-kube-api-access-h2rn8\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.363403 master-0 kubenswrapper[16352]: I0307 21:42:55.363375 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-public-tls-certs\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.363663 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-internal-tls-certs\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.363820 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a43b13cc-0d91-4625-91cb-003ccf32af78-run-httpd\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.363910 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-combined-ca-bundle\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.363983 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a43b13cc-0d91-4625-91cb-003ccf32af78-log-httpd\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.364210 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-config-data\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.366227 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a43b13cc-0d91-4625-91cb-003ccf32af78-log-httpd\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.366230 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a43b13cc-0d91-4625-91cb-003ccf32af78-run-httpd\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.367097 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/a43b13cc-0d91-4625-91cb-003ccf32af78-etc-swift\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.368100 master-0 kubenswrapper[16352]: I0307 21:42:55.368097 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-public-tls-certs\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.379172 master-0 kubenswrapper[16352]: I0307 21:42:55.372383 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-internal-tls-certs\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.379172 master-0 kubenswrapper[16352]: I0307 21:42:55.372738 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-config-data\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.379172 master-0 kubenswrapper[16352]: I0307 21:42:55.375496 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a43b13cc-0d91-4625-91cb-003ccf32af78-combined-ca-bundle\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.387743 master-0 kubenswrapper[16352]: I0307 21:42:55.387620 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2rn8\" (UniqueName: \"kubernetes.io/projected/a43b13cc-0d91-4625-91cb-003ccf32af78-kube-api-access-h2rn8\") pod \"swift-proxy-7b675b8b94-rfvgr\" (UID: \"a43b13cc-0d91-4625-91cb-003ccf32af78\") " pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:55.578858 master-0 kubenswrapper[16352]: I0307 21:42:55.578664 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:42:56.132439 master-0 kubenswrapper[16352]: I0307 21:42:56.131474 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-7b675b8b94-rfvgr"]
Mar 07 21:42:56.230082 master-0 kubenswrapper[16352]: I0307 21:42:56.230014 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b675b8b94-rfvgr" event={"ID":"a43b13cc-0d91-4625-91cb-003ccf32af78","Type":"ContainerStarted","Data":"7025a51f743bfc2da3121ecf9ac5029e1f43bdc49e5df291c02aed2ae95218b3"}
Mar 07 21:42:56.712470 master-0 kubenswrapper[16352]: I0307 21:42:56.712421 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-db-sync-hst88"
Mar 07 21:42:56.814342 master-0 kubenswrapper[16352]: I0307 21:42:56.814227 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c65c147-410b-4022-9104-20fb7c362674-etc-podinfo\") pod \"4c65c147-410b-4022-9104-20fb7c362674\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") "
Mar 07 21:42:56.814672 master-0 kubenswrapper[16352]: I0307 21:42:56.814580 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"4c65c147-410b-4022-9104-20fb7c362674\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") "
Mar 07 21:42:56.814842 master-0 kubenswrapper[16352]: I0307 21:42:56.814769 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-combined-ca-bundle\") pod \"4c65c147-410b-4022-9104-20fb7c362674\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") "
Mar 07 21:42:56.814842 master-0 kubenswrapper[16352]: I0307 21:42:56.814827 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-config\") pod \"4c65c147-410b-4022-9104-20fb7c362674\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") "
Mar 07 21:42:56.814945 master-0 kubenswrapper[16352]: I0307 21:42:56.814945 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rw4lv\" (UniqueName: \"kubernetes.io/projected/4c65c147-410b-4022-9104-20fb7c362674-kube-api-access-rw4lv\") pod \"4c65c147-410b-4022-9104-20fb7c362674\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") "
Mar 07 21:42:56.815061 master-0
kubenswrapper[16352]: I0307 21:42:56.814995 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-scripts\") pod \"4c65c147-410b-4022-9104-20fb7c362674\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " Mar 07 21:42:56.816044 master-0 kubenswrapper[16352]: I0307 21:42:56.815113 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic\") pod \"4c65c147-410b-4022-9104-20fb7c362674\" (UID: \"4c65c147-410b-4022-9104-20fb7c362674\") " Mar 07 21:42:56.816044 master-0 kubenswrapper[16352]: I0307 21:42:56.815221 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") pod "4c65c147-410b-4022-9104-20fb7c362674" (UID: "4c65c147-410b-4022-9104-20fb7c362674"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:42:56.816044 master-0 kubenswrapper[16352]: I0307 21:42:56.815636 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "4c65c147-410b-4022-9104-20fb7c362674" (UID: "4c65c147-410b-4022-9104-20fb7c362674"). InnerVolumeSpecName "var-lib-ironic". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:42:56.816399 master-0 kubenswrapper[16352]: I0307 21:42:56.816331 16352 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:56.816399 master-0 kubenswrapper[16352]: I0307 21:42:56.816376 16352 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/4c65c147-410b-4022-9104-20fb7c362674-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:56.818412 master-0 kubenswrapper[16352]: I0307 21:42:56.818021 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/4c65c147-410b-4022-9104-20fb7c362674-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "4c65c147-410b-4022-9104-20fb7c362674" (UID: "4c65c147-410b-4022-9104-20fb7c362674"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 07 21:42:56.820585 master-0 kubenswrapper[16352]: I0307 21:42:56.820480 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c65c147-410b-4022-9104-20fb7c362674-kube-api-access-rw4lv" (OuterVolumeSpecName: "kube-api-access-rw4lv") pod "4c65c147-410b-4022-9104-20fb7c362674" (UID: "4c65c147-410b-4022-9104-20fb7c362674"). InnerVolumeSpecName "kube-api-access-rw4lv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:42:56.820891 master-0 kubenswrapper[16352]: I0307 21:42:56.820837 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-scripts" (OuterVolumeSpecName: "scripts") pod "4c65c147-410b-4022-9104-20fb7c362674" (UID: "4c65c147-410b-4022-9104-20fb7c362674"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:56.868511 master-0 kubenswrapper[16352]: I0307 21:42:56.868383 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-config" (OuterVolumeSpecName: "config") pod "4c65c147-410b-4022-9104-20fb7c362674" (UID: "4c65c147-410b-4022-9104-20fb7c362674"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:56.874878 master-0 kubenswrapper[16352]: I0307 21:42:56.874794 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4c65c147-410b-4022-9104-20fb7c362674" (UID: "4c65c147-410b-4022-9104-20fb7c362674"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:42:56.919796 master-0 kubenswrapper[16352]: I0307 21:42:56.918824 16352 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/4c65c147-410b-4022-9104-20fb7c362674-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:56.919796 master-0 kubenswrapper[16352]: I0307 21:42:56.918883 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:56.919796 master-0 kubenswrapper[16352]: I0307 21:42:56.918897 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:56.919796 master-0 kubenswrapper[16352]: I0307 21:42:56.918909 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rw4lv\" (UniqueName: 
\"kubernetes.io/projected/4c65c147-410b-4022-9104-20fb7c362674-kube-api-access-rw4lv\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:56.919796 master-0 kubenswrapper[16352]: I0307 21:42:56.918921 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/4c65c147-410b-4022-9104-20fb7c362674-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:42:57.257556 master-0 kubenswrapper[16352]: I0307 21:42:57.257478 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-db-sync-hst88" event={"ID":"4c65c147-410b-4022-9104-20fb7c362674","Type":"ContainerDied","Data":"cea30d315ccb2026bd438c88ddad30d9033c0bb786a564cba9cb8f4ced800c5d"} Mar 07 21:42:57.257556 master-0 kubenswrapper[16352]: I0307 21:42:57.257549 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cea30d315ccb2026bd438c88ddad30d9033c0bb786a564cba9cb8f4ced800c5d" Mar 07 21:42:57.258238 master-0 kubenswrapper[16352]: I0307 21:42:57.257637 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-db-sync-hst88" Mar 07 21:42:57.260262 master-0 kubenswrapper[16352]: I0307 21:42:57.260171 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b675b8b94-rfvgr" event={"ID":"a43b13cc-0d91-4625-91cb-003ccf32af78","Type":"ContainerStarted","Data":"a1f1d1492a1b18b80670297c5360f2c6c52a2163d2f57eb592a4d6d8d4aa9c49"} Mar 07 21:42:57.260262 master-0 kubenswrapper[16352]: I0307 21:42:57.260208 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-7b675b8b94-rfvgr" event={"ID":"a43b13cc-0d91-4625-91cb-003ccf32af78","Type":"ContainerStarted","Data":"1f3135a26eb746853a4ccb34a2b4075c5d6bad7cd3dc77720d43f3ce8b2fed57"} Mar 07 21:42:57.260833 master-0 kubenswrapper[16352]: I0307 21:42:57.260367 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b675b8b94-rfvgr" Mar 07 21:42:57.260833 master-0 kubenswrapper[16352]: I0307 21:42:57.260718 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-7b675b8b94-rfvgr" Mar 07 21:42:57.611339 master-0 kubenswrapper[16352]: I0307 21:42:57.611137 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-7b675b8b94-rfvgr" podStartSLOduration=2.611112156 podStartE2EDuration="2.611112156s" podCreationTimestamp="2026-03-07 21:42:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:42:57.57048256 +0000 UTC m=+1500.641187619" watchObservedRunningTime="2026-03-07 21:42:57.611112156 +0000 UTC m=+1500.681817215" Mar 07 21:42:58.642190 master-0 kubenswrapper[16352]: I0307 21:42:58.642110 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-fd8d8c7c7-w5vwh" Mar 07 21:42:58.991817 master-0 kubenswrapper[16352]: I0307 21:42:58.977180 16352 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/neutron-f49f69884-v8xz2"] Mar 07 21:42:59.043752 master-0 kubenswrapper[16352]: I0307 21:42:59.028382 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f49f69884-v8xz2" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-httpd" containerID="cri-o://71178167e291da054ac2196c37ee9b7fc49465fe3da171034ec208c32399bdcf" gracePeriod=30 Mar 07 21:42:59.043752 master-0 kubenswrapper[16352]: I0307 21:42:59.033107 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-f49f69884-v8xz2" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-api" containerID="cri-o://fa9fa60fa174582ce2847403b83e63de271e277c86bae70067df78a0f025b6ee" gracePeriod=30 Mar 07 21:42:59.270516 master-0 kubenswrapper[16352]: I0307 21:42:59.269312 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7754f44b87-jrdnd"] Mar 07 21:42:59.270516 master-0 kubenswrapper[16352]: E0307 21:42:59.269837 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c65c147-410b-4022-9104-20fb7c362674" containerName="ironic-inspector-db-sync" Mar 07 21:42:59.270516 master-0 kubenswrapper[16352]: I0307 21:42:59.269855 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c65c147-410b-4022-9104-20fb7c362674" containerName="ironic-inspector-db-sync" Mar 07 21:42:59.270516 master-0 kubenswrapper[16352]: I0307 21:42:59.270156 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="4c65c147-410b-4022-9104-20fb7c362674" containerName="ironic-inspector-db-sync" Mar 07 21:42:59.272294 master-0 kubenswrapper[16352]: I0307 21:42:59.271646 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.280741 master-0 kubenswrapper[16352]: I0307 21:42:59.280525 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7754f44b87-jrdnd"] Mar 07 21:42:59.346172 master-0 kubenswrapper[16352]: I0307 21:42:59.346040 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dfs6m\" (UniqueName: \"kubernetes.io/projected/27e6c72c-28fc-4783-a670-31fe4f9b98fe-kube-api-access-dfs6m\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.346172 master-0 kubenswrapper[16352]: I0307 21:42:59.346126 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-svc\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.347249 master-0 kubenswrapper[16352]: I0307 21:42:59.346492 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-config\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.347249 master-0 kubenswrapper[16352]: I0307 21:42:59.346826 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-swift-storage-0\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.348593 master-0 kubenswrapper[16352]: I0307 
21:42:59.347870 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-sb\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.348593 master-0 kubenswrapper[16352]: I0307 21:42:59.347958 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-nb\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.437525 master-0 kubenswrapper[16352]: I0307 21:42:59.437026 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:42:59.470967 master-0 kubenswrapper[16352]: I0307 21:42:59.460156 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 07 21:42:59.470967 master-0 kubenswrapper[16352]: I0307 21:42:59.468690 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 07 21:42:59.470967 master-0 kubenswrapper[16352]: I0307 21:42:59.469239 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 07 21:42:59.470967 master-0 kubenswrapper[16352]: I0307 21:42:59.469781 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 07 21:42:59.482174 master-0 kubenswrapper[16352]: I0307 21:42:59.481808 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:42:59.493567 master-0 kubenswrapper[16352]: I0307 21:42:59.492983 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-sb\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.493567 master-0 kubenswrapper[16352]: I0307 21:42:59.493289 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-nb\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.494717 master-0 kubenswrapper[16352]: I0307 21:42:59.493821 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dfs6m\" (UniqueName: \"kubernetes.io/projected/27e6c72c-28fc-4783-a670-31fe4f9b98fe-kube-api-access-dfs6m\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " 
pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.494717 master-0 kubenswrapper[16352]: I0307 21:42:59.493878 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-svc\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.494717 master-0 kubenswrapper[16352]: I0307 21:42:59.493982 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-config\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.494717 master-0 kubenswrapper[16352]: I0307 21:42:59.494035 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-swift-storage-0\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.506542 master-0 kubenswrapper[16352]: I0307 21:42:59.495284 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-swift-storage-0\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.506542 master-0 kubenswrapper[16352]: I0307 21:42:59.495995 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-svc\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " 
pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.506542 master-0 kubenswrapper[16352]: I0307 21:42:59.496545 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-config\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.506542 master-0 kubenswrapper[16352]: I0307 21:42:59.497284 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-sb\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.506542 master-0 kubenswrapper[16352]: I0307 21:42:59.499223 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-nb\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.590759 master-0 kubenswrapper[16352]: I0307 21:42:59.581150 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dfs6m\" (UniqueName: \"kubernetes.io/projected/27e6c72c-28fc-4783-a670-31fe4f9b98fe-kube-api-access-dfs6m\") pod \"dnsmasq-dns-7754f44b87-jrdnd\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") " pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.630370 master-0 kubenswrapper[16352]: I0307 21:42:59.625896 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " 
pod="openstack/ironic-inspector-0" Mar 07 21:42:59.630370 master-0 kubenswrapper[16352]: I0307 21:42:59.626289 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-79qk4\" (UniqueName: \"kubernetes.io/projected/c7d1db8d-1ad0-49f1-a993-d68d3587f595-kube-api-access-79qk4\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.630370 master-0 kubenswrapper[16352]: I0307 21:42:59.626447 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.630370 master-0 kubenswrapper[16352]: I0307 21:42:59.626548 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7d1db8d-1ad0-49f1-a993-d68d3587f595-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.630370 master-0 kubenswrapper[16352]: I0307 21:42:59.626709 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.630370 master-0 kubenswrapper[16352]: I0307 21:42:59.626801 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-scripts\") pod 
\"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.630370 master-0 kubenswrapper[16352]: I0307 21:42:59.627017 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-config\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.656883 master-0 kubenswrapper[16352]: I0307 21:42:59.656707 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-94ssk"] Mar 07 21:42:59.664648 master-0 kubenswrapper[16352]: I0307 21:42:59.658872 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-94ssk" Mar 07 21:42:59.726203 master-0 kubenswrapper[16352]: I0307 21:42:59.720740 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:42:59.738709 master-0 kubenswrapper[16352]: I0307 21:42:59.737399 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-94ssk"] Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.749883 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfstw\" (UniqueName: \"kubernetes.io/projected/38ea555e-6a67-483f-96a2-9587104b0c38-kube-api-access-sfstw\") pod \"nova-api-db-create-94ssk\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " pod="openstack/nova-api-db-create-94ssk" Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.749975 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-config\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " 
pod="openstack/ironic-inspector-0" Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.750039 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.750108 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-79qk4\" (UniqueName: \"kubernetes.io/projected/c7d1db8d-1ad0-49f1-a993-d68d3587f595-kube-api-access-79qk4\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.750130 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.750158 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7d1db8d-1ad0-49f1-a993-d68d3587f595-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0" Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.750212 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " 
pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.750254 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38ea555e-6a67-483f-96a2-9587104b0c38-operator-scripts\") pod \"nova-api-db-create-94ssk\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " pod="openstack/nova-api-db-create-94ssk"
Mar 07 21:42:59.750514 master-0 kubenswrapper[16352]: I0307 21:42:59.750278 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-scripts\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.757889 master-0 kubenswrapper[16352]: I0307 21:42:59.757533 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.758091 master-0 kubenswrapper[16352]: I0307 21:42:59.758047 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.758404 master-0 kubenswrapper[16352]: I0307 21:42:59.758359 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.764280 master-0 kubenswrapper[16352]: I0307 21:42:59.763833 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-scripts\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.771721 master-0 kubenswrapper[16352]: I0307 21:42:59.767724 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-config\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.771721 master-0 kubenswrapper[16352]: I0307 21:42:59.768346 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7d1db8d-1ad0-49f1-a993-d68d3587f595-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.778099 master-0 kubenswrapper[16352]: I0307 21:42:59.777942 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-64285"]
Mar 07 21:42:59.783256 master-0 kubenswrapper[16352]: I0307 21:42:59.780258 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:42:59.786310 master-0 kubenswrapper[16352]: I0307 21:42:59.786212 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-79qk4\" (UniqueName: \"kubernetes.io/projected/c7d1db8d-1ad0-49f1-a993-d68d3587f595-kube-api-access-79qk4\") pod \"ironic-inspector-0\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.789496 master-0 kubenswrapper[16352]: I0307 21:42:59.789446 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0"
Mar 07 21:42:59.817159 master-0 kubenswrapper[16352]: I0307 21:42:59.817070 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-64285"]
Mar 07 21:42:59.847779 master-0 kubenswrapper[16352]: I0307 21:42:59.846509 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-8a73-account-create-update-s57x2"]
Mar 07 21:42:59.850700 master-0 kubenswrapper[16352]: I0307 21:42:59.849203 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:42:59.851786 master-0 kubenswrapper[16352]: I0307 21:42:59.851740 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret"
Mar 07 21:42:59.854031 master-0 kubenswrapper[16352]: I0307 21:42:59.853957 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ws6tk\" (UniqueName: \"kubernetes.io/projected/2130caf7-6f24-4cb7-a216-e60f2b951f4a-kube-api-access-ws6tk\") pod \"nova-cell0-db-create-64285\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:42:59.854202 master-0 kubenswrapper[16352]: I0307 21:42:59.854175 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38ea555e-6a67-483f-96a2-9587104b0c38-operator-scripts\") pod \"nova-api-db-create-94ssk\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " pod="openstack/nova-api-db-create-94ssk"
Mar 07 21:42:59.854267 master-0 kubenswrapper[16352]: I0307 21:42:59.854216 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2130caf7-6f24-4cb7-a216-e60f2b951f4a-operator-scripts\") pod \"nova-cell0-db-create-64285\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:42:59.854321 master-0 kubenswrapper[16352]: I0307 21:42:59.854298 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfstw\" (UniqueName: \"kubernetes.io/projected/38ea555e-6a67-483f-96a2-9587104b0c38-kube-api-access-sfstw\") pod \"nova-api-db-create-94ssk\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " pod="openstack/nova-api-db-create-94ssk"
Mar 07 21:42:59.855397 master-0 kubenswrapper[16352]: I0307 21:42:59.855358 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38ea555e-6a67-483f-96a2-9587104b0c38-operator-scripts\") pod \"nova-api-db-create-94ssk\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " pod="openstack/nova-api-db-create-94ssk"
Mar 07 21:42:59.871546 master-0 kubenswrapper[16352]: I0307 21:42:59.863205 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8a73-account-create-update-s57x2"]
Mar 07 21:42:59.882404 master-0 kubenswrapper[16352]: I0307 21:42:59.882352 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfstw\" (UniqueName: \"kubernetes.io/projected/38ea555e-6a67-483f-96a2-9587104b0c38-kube-api-access-sfstw\") pod \"nova-api-db-create-94ssk\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " pod="openstack/nova-api-db-create-94ssk"
Mar 07 21:42:59.905780 master-0 kubenswrapper[16352]: I0307 21:42:59.903724 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-26xt5"]
Mar 07 21:42:59.907648 master-0 kubenswrapper[16352]: I0307 21:42:59.907616 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:42:59.925793 master-0 kubenswrapper[16352]: I0307 21:42:59.925565 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-26xt5"]
Mar 07 21:42:59.982095 master-0 kubenswrapper[16352]: I0307 21:42:59.979577 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ws6tk\" (UniqueName: \"kubernetes.io/projected/2130caf7-6f24-4cb7-a216-e60f2b951f4a-kube-api-access-ws6tk\") pod \"nova-cell0-db-create-64285\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:42:59.982095 master-0 kubenswrapper[16352]: I0307 21:42:59.979764 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b487d67-175f-402c-883f-a4001fd9160c-operator-scripts\") pod \"nova-api-8a73-account-create-update-s57x2\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:42:59.982095 master-0 kubenswrapper[16352]: I0307 21:42:59.979907 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sxs8f\" (UniqueName: \"kubernetes.io/projected/2b487d67-175f-402c-883f-a4001fd9160c-kube-api-access-sxs8f\") pod \"nova-api-8a73-account-create-update-s57x2\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:42:59.982095 master-0 kubenswrapper[16352]: I0307 21:42:59.981466 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2130caf7-6f24-4cb7-a216-e60f2b951f4a-operator-scripts\") pod \"nova-cell0-db-create-64285\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:42:59.989545 master-0 kubenswrapper[16352]: I0307 21:42:59.989483 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-94ssk"
Mar 07 21:43:00.006375 master-0 kubenswrapper[16352]: I0307 21:43:00.005176 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2130caf7-6f24-4cb7-a216-e60f2b951f4a-operator-scripts\") pod \"nova-cell0-db-create-64285\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:43:00.023323 master-0 kubenswrapper[16352]: I0307 21:43:00.022076 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ws6tk\" (UniqueName: \"kubernetes.io/projected/2130caf7-6f24-4cb7-a216-e60f2b951f4a-kube-api-access-ws6tk\") pod \"nova-cell0-db-create-64285\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:43:00.037293 master-0 kubenswrapper[16352]: I0307 21:43:00.036940 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-0300-account-create-update-b66m5"]
Mar 07 21:43:00.039205 master-0 kubenswrapper[16352]: I0307 21:43:00.039027 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.063969 master-0 kubenswrapper[16352]: I0307 21:43:00.058285 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret"
Mar 07 21:43:00.063969 master-0 kubenswrapper[16352]: I0307 21:43:00.063044 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0300-account-create-update-b66m5"]
Mar 07 21:43:00.085010 master-0 kubenswrapper[16352]: I0307 21:43:00.084950 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6jtvv\" (UniqueName: \"kubernetes.io/projected/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-kube-api-access-6jtvv\") pod \"nova-cell1-db-create-26xt5\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:43:00.085259 master-0 kubenswrapper[16352]: I0307 21:43:00.085078 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b487d67-175f-402c-883f-a4001fd9160c-operator-scripts\") pod \"nova-api-8a73-account-create-update-s57x2\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:43:00.085259 master-0 kubenswrapper[16352]: I0307 21:43:00.085115 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sxs8f\" (UniqueName: \"kubernetes.io/projected/2b487d67-175f-402c-883f-a4001fd9160c-kube-api-access-sxs8f\") pod \"nova-api-8a73-account-create-update-s57x2\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:43:00.085259 master-0 kubenswrapper[16352]: I0307 21:43:00.085243 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-operator-scripts\") pod \"nova-cell1-db-create-26xt5\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:43:00.086494 master-0 kubenswrapper[16352]: I0307 21:43:00.086466 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b487d67-175f-402c-883f-a4001fd9160c-operator-scripts\") pod \"nova-api-8a73-account-create-update-s57x2\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:43:00.115646 master-0 kubenswrapper[16352]: I0307 21:43:00.115581 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sxs8f\" (UniqueName: \"kubernetes.io/projected/2b487d67-175f-402c-883f-a4001fd9160c-kube-api-access-sxs8f\") pod \"nova-api-8a73-account-create-update-s57x2\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:43:00.128778 master-0 kubenswrapper[16352]: I0307 21:43:00.127020 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-2b75-account-create-update-gqckp"]
Mar 07 21:43:00.131335 master-0 kubenswrapper[16352]: I0307 21:43:00.131289 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:00.134401 master-0 kubenswrapper[16352]: I0307 21:43:00.134291 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret"
Mar 07 21:43:00.175691 master-0 kubenswrapper[16352]: I0307 21:43:00.175571 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2b75-account-create-update-gqckp"]
Mar 07 21:43:00.188078 master-0 kubenswrapper[16352]: I0307 21:43:00.188002 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6jtvv\" (UniqueName: \"kubernetes.io/projected/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-kube-api-access-6jtvv\") pod \"nova-cell1-db-create-26xt5\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:43:00.188338 master-0 kubenswrapper[16352]: I0307 21:43:00.188198 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nvj5n\" (UniqueName: \"kubernetes.io/projected/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-kube-api-access-nvj5n\") pod \"nova-cell0-0300-account-create-update-b66m5\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.188338 master-0 kubenswrapper[16352]: I0307 21:43:00.188244 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-operator-scripts\") pod \"nova-cell0-0300-account-create-update-b66m5\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.188338 master-0 kubenswrapper[16352]: I0307 21:43:00.188293 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-operator-scripts\") pod \"nova-cell1-db-create-26xt5\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:43:00.189751 master-0 kubenswrapper[16352]: I0307 21:43:00.189699 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-operator-scripts\") pod \"nova-cell1-db-create-26xt5\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:43:00.213854 master-0 kubenswrapper[16352]: I0307 21:43:00.213758 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6jtvv\" (UniqueName: \"kubernetes.io/projected/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-kube-api-access-6jtvv\") pod \"nova-cell1-db-create-26xt5\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:43:00.291517 master-0 kubenswrapper[16352]: I0307 21:43:00.291347 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkwl9\" (UniqueName: \"kubernetes.io/projected/5793afc9-06c2-497b-ab66-92254e79e871-kube-api-access-zkwl9\") pod \"nova-cell1-2b75-account-create-update-gqckp\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:00.292303 master-0 kubenswrapper[16352]: I0307 21:43:00.292252 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-64285"
Mar 07 21:43:00.292611 master-0 kubenswrapper[16352]: I0307 21:43:00.292558 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nvj5n\" (UniqueName: \"kubernetes.io/projected/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-kube-api-access-nvj5n\") pod \"nova-cell0-0300-account-create-update-b66m5\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.292882 master-0 kubenswrapper[16352]: I0307 21:43:00.292852 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-operator-scripts\") pod \"nova-cell0-0300-account-create-update-b66m5\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.293471 master-0 kubenswrapper[16352]: I0307 21:43:00.293449 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5793afc9-06c2-497b-ab66-92254e79e871-operator-scripts\") pod \"nova-cell1-2b75-account-create-update-gqckp\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:00.294292 master-0 kubenswrapper[16352]: I0307 21:43:00.294119 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-operator-scripts\") pod \"nova-cell0-0300-account-create-update-b66m5\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.306946 master-0 kubenswrapper[16352]: I0307 21:43:00.306698 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8a73-account-create-update-s57x2"
Mar 07 21:43:00.313047 master-0 kubenswrapper[16352]: I0307 21:43:00.312994 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nvj5n\" (UniqueName: \"kubernetes.io/projected/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-kube-api-access-nvj5n\") pod \"nova-cell0-0300-account-create-update-b66m5\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.389710 master-0 kubenswrapper[16352]: I0307 21:43:00.389150 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-26xt5"
Mar 07 21:43:00.396495 master-0 kubenswrapper[16352]: I0307 21:43:00.396439 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5793afc9-06c2-497b-ab66-92254e79e871-operator-scripts\") pod \"nova-cell1-2b75-account-create-update-gqckp\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:00.397920 master-0 kubenswrapper[16352]: I0307 21:43:00.397663 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5793afc9-06c2-497b-ab66-92254e79e871-operator-scripts\") pod \"nova-cell1-2b75-account-create-update-gqckp\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:00.397987 master-0 kubenswrapper[16352]: I0307 21:43:00.397970 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkwl9\" (UniqueName: \"kubernetes.io/projected/5793afc9-06c2-497b-ab66-92254e79e871-kube-api-access-zkwl9\") pod \"nova-cell1-2b75-account-create-update-gqckp\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:00.415636 master-0 kubenswrapper[16352]: I0307 21:43:00.415569 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0300-account-create-update-b66m5"
Mar 07 21:43:00.422726 master-0 kubenswrapper[16352]: I0307 21:43:00.422648 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkwl9\" (UniqueName: \"kubernetes.io/projected/5793afc9-06c2-497b-ab66-92254e79e871-kube-api-access-zkwl9\") pod \"nova-cell1-2b75-account-create-update-gqckp\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:00.465570 master-0 kubenswrapper[16352]: I0307 21:43:00.465476 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2b75-account-create-update-gqckp"
Mar 07 21:43:03.514285 master-0 kubenswrapper[16352]: I0307 21:43:03.514199 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"]
Mar 07 21:43:05.454751 master-0 kubenswrapper[16352]: I0307 21:43:05.454527 16352 generic.go:334] "Generic (PLEG): container finished" podID="8c877c04-56be-4df0-b751-4691351e9f5d" containerID="71178167e291da054ac2196c37ee9b7fc49465fe3da171034ec208c32399bdcf" exitCode=0
Mar 07 21:43:05.454751 master-0 kubenswrapper[16352]: I0307 21:43:05.454588 16352 generic.go:334] "Generic (PLEG): container finished" podID="8c877c04-56be-4df0-b751-4691351e9f5d" containerID="fa9fa60fa174582ce2847403b83e63de271e277c86bae70067df78a0f025b6ee" exitCode=0
Mar 07 21:43:05.454751 master-0 kubenswrapper[16352]: I0307 21:43:05.454618 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f49f69884-v8xz2" event={"ID":"8c877c04-56be-4df0-b751-4691351e9f5d","Type":"ContainerDied","Data":"71178167e291da054ac2196c37ee9b7fc49465fe3da171034ec208c32399bdcf"}
Mar 07 21:43:05.454751 master-0 kubenswrapper[16352]: I0307 21:43:05.454657 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f49f69884-v8xz2" event={"ID":"8c877c04-56be-4df0-b751-4691351e9f5d","Type":"ContainerDied","Data":"fa9fa60fa174582ce2847403b83e63de271e277c86bae70067df78a0f025b6ee"}
Mar 07 21:43:05.595322 master-0 kubenswrapper[16352]: I0307 21:43:05.595251 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:43:05.617845 master-0 kubenswrapper[16352]: I0307 21:43:05.617605 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-7b675b8b94-rfvgr"
Mar 07 21:43:06.715161 master-0 kubenswrapper[16352]: I0307 21:43:06.715058 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-external-api-0"]
Mar 07 21:43:06.715861 master-0 kubenswrapper[16352]: I0307 21:43:06.715670 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-213eb-default-external-api-0" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-log" containerID="cri-o://919ca7a10f7d2057908b264c89ded111932385985e0c2acc946a7e48c2af77dc" gracePeriod=30
Mar 07 21:43:06.716426 master-0 kubenswrapper[16352]: I0307 21:43:06.716070 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-213eb-default-external-api-0" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-httpd" containerID="cri-o://2ce34483211f1234aa598d6b4877465bf2b3da6001d0f92033be8f259c68d8a2" gracePeriod=30
Mar 07 21:43:07.576499 master-0 kubenswrapper[16352]: I0307 21:43:07.576339 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"]
Mar 07 21:43:07.577165 master-0 kubenswrapper[16352]: I0307 21:43:07.577040 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-213eb-default-internal-api-0" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-log" containerID="cri-o://58b019180d18af9978ed984fa8ea3f3388b3fd37aa0ec168b38ef005be8346d2" gracePeriod=30
Mar 07 21:43:07.577664 master-0 kubenswrapper[16352]: I0307 21:43:07.577558 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-213eb-default-internal-api-0" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-httpd" containerID="cri-o://d75b2a2e94bc7bcc078b90db1a505b9b3425739b0825ebdd8b77bcc6009b212e" gracePeriod=30
Mar 07 21:43:09.192357 master-0 kubenswrapper[16352]: I0307 21:43:09.192250 16352 scope.go:117] "RemoveContainer" containerID="7cbabceeaaf5d4439754f51f551dc58268ac32bc0a835865f49e6eb911d57031"
Mar 07 21:43:09.928870 master-0 kubenswrapper[16352]: I0307 21:43:09.928783 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5dbd89f674-7gtrq"
Mar 07 21:43:10.025497 master-0 kubenswrapper[16352]: I0307 21:43:10.025386 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-5dbd89f674-7gtrq"
Mar 07 21:43:10.148711 master-0 kubenswrapper[16352]: I0307 21:43:10.147963 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cc7544794-vmcq4"]
Mar 07 21:43:10.148711 master-0 kubenswrapper[16352]: I0307 21:43:10.148391 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cc7544794-vmcq4" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-log" containerID="cri-o://56401df6f757863c0a67a6d2486cdef43b3b8a5aea7f0f3263e07faea84a7d68" gracePeriod=30
Mar 07 21:43:10.148711 master-0 kubenswrapper[16352]: I0307 21:43:10.148600 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-6cc7544794-vmcq4" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-api" containerID="cri-o://a039f4a32907524546ecdece0f12ff81c8e16de6f2c7d7dbc0e3d8831b933b8c" gracePeriod=30
Mar 07 21:43:13.101292 master-0 kubenswrapper[16352]: I0307 21:43:13.101157 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:43:13.255218 master-0 kubenswrapper[16352]: I0307 21:43:13.244030 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-config\") pod \"8c877c04-56be-4df0-b751-4691351e9f5d\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") "
Mar 07 21:43:13.255218 master-0 kubenswrapper[16352]: I0307 21:43:13.244288 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-ovndb-tls-certs\") pod \"8c877c04-56be-4df0-b751-4691351e9f5d\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") "
Mar 07 21:43:13.255218 master-0 kubenswrapper[16352]: I0307 21:43:13.244465 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8rdh\" (UniqueName: \"kubernetes.io/projected/8c877c04-56be-4df0-b751-4691351e9f5d-kube-api-access-h8rdh\") pod \"8c877c04-56be-4df0-b751-4691351e9f5d\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") "
Mar 07 21:43:13.255218 master-0 kubenswrapper[16352]: I0307 21:43:13.244574 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-combined-ca-bundle\") pod \"8c877c04-56be-4df0-b751-4691351e9f5d\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") "
Mar 07 21:43:13.255218 master-0 kubenswrapper[16352]: I0307 21:43:13.244684 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-httpd-config\") pod \"8c877c04-56be-4df0-b751-4691351e9f5d\" (UID: \"8c877c04-56be-4df0-b751-4691351e9f5d\") "
Mar 07 21:43:13.275793 master-0 kubenswrapper[16352]: I0307 21:43:13.274047 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c877c04-56be-4df0-b751-4691351e9f5d-kube-api-access-h8rdh" (OuterVolumeSpecName: "kube-api-access-h8rdh") pod "8c877c04-56be-4df0-b751-4691351e9f5d" (UID: "8c877c04-56be-4df0-b751-4691351e9f5d"). InnerVolumeSpecName "kube-api-access-h8rdh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:43:13.289748 master-0 kubenswrapper[16352]: I0307 21:43:13.288123 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "8c877c04-56be-4df0-b751-4691351e9f5d" (UID: "8c877c04-56be-4df0-b751-4691351e9f5d"). InnerVolumeSpecName "httpd-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:43:13.359863 master-0 kubenswrapper[16352]: I0307 21:43:13.354295 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8rdh\" (UniqueName: \"kubernetes.io/projected/8c877c04-56be-4df0-b751-4691351e9f5d-kube-api-access-h8rdh\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:13.359863 master-0 kubenswrapper[16352]: I0307 21:43:13.354340 16352 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-httpd-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:13.394136 master-0 kubenswrapper[16352]: I0307 21:43:13.380592 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-config" (OuterVolumeSpecName: "config") pod "8c877c04-56be-4df0-b751-4691351e9f5d" (UID: "8c877c04-56be-4df0-b751-4691351e9f5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:43:13.413714 master-0 kubenswrapper[16352]: I0307 21:43:13.411824 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8c877c04-56be-4df0-b751-4691351e9f5d" (UID: "8c877c04-56be-4df0-b751-4691351e9f5d"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:43:13.442975 master-0 kubenswrapper[16352]: I0307 21:43:13.430571 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "8c877c04-56be-4df0-b751-4691351e9f5d" (UID: "8c877c04-56be-4df0-b751-4691351e9f5d"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:43:13.461963 master-0 kubenswrapper[16352]: I0307 21:43:13.461775 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:13.461963 master-0 kubenswrapper[16352]: I0307 21:43:13.461842 16352 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-ovndb-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:13.461963 master-0 kubenswrapper[16352]: I0307 21:43:13.461862 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8c877c04-56be-4df0-b751-4691351e9f5d-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:13.660153 master-0 kubenswrapper[16352]: I0307 21:43:13.659611 16352 generic.go:334] "Generic (PLEG): container finished" podID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerID="a039f4a32907524546ecdece0f12ff81c8e16de6f2c7d7dbc0e3d8831b933b8c" exitCode=0
Mar 07 21:43:13.660153 master-0 kubenswrapper[16352]: I0307 21:43:13.659653 16352 generic.go:334] "Generic (PLEG): container finished" podID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerID="56401df6f757863c0a67a6d2486cdef43b3b8a5aea7f0f3263e07faea84a7d68" exitCode=143
Mar 07 21:43:13.660153 master-0 kubenswrapper[16352]: I0307 21:43:13.659728 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc7544794-vmcq4" event={"ID":"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f","Type":"ContainerDied","Data":"a039f4a32907524546ecdece0f12ff81c8e16de6f2c7d7dbc0e3d8831b933b8c"}
Mar 07 21:43:13.660153 master-0 kubenswrapper[16352]: I0307 21:43:13.659757 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc7544794-vmcq4" event={"ID":"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f","Type":"ContainerDied","Data":"56401df6f757863c0a67a6d2486cdef43b3b8a5aea7f0f3263e07faea84a7d68"}
Mar 07 21:43:13.665572 master-0 kubenswrapper[16352]: I0307 21:43:13.665542 16352 generic.go:334] "Generic (PLEG): container finished" podID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerID="d75b2a2e94bc7bcc078b90db1a505b9b3425739b0825ebdd8b77bcc6009b212e" exitCode=0
Mar 07 21:43:13.665572 master-0 kubenswrapper[16352]: I0307 21:43:13.665566 16352 generic.go:334] "Generic (PLEG): container finished" podID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerID="58b019180d18af9978ed984fa8ea3f3388b3fd37aa0ec168b38ef005be8346d2" exitCode=143
Mar 07 21:43:13.665673 master-0 kubenswrapper[16352]: I0307 21:43:13.665609 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"34febdc7-58ae-4ec2-a8f3-92011ca01d81","Type":"ContainerDied","Data":"d75b2a2e94bc7bcc078b90db1a505b9b3425739b0825ebdd8b77bcc6009b212e"}
Mar 07 21:43:13.665673 master-0 kubenswrapper[16352]: I0307 21:43:13.665630 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"34febdc7-58ae-4ec2-a8f3-92011ca01d81","Type":"ContainerDied","Data":"58b019180d18af9978ed984fa8ea3f3388b3fd37aa0ec168b38ef005be8346d2"}
Mar 07 21:43:13.692799 master-0 kubenswrapper[16352]: I0307 21:43:13.687120 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-f49f69884-v8xz2" event={"ID":"8c877c04-56be-4df0-b751-4691351e9f5d","Type":"ContainerDied","Data":"1517a90c24294061c003125fcc14aa3418e09b97326300ca39ca2db7281f1f96"}
Mar 07 21:43:13.692799 master-0 kubenswrapper[16352]: I0307 21:43:13.687219 16352 scope.go:117] "RemoveContainer" containerID="71178167e291da054ac2196c37ee9b7fc49465fe3da171034ec208c32399bdcf"
Mar 07 21:43:13.692799 master-0 kubenswrapper[16352]: I0307 21:43:13.687445 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-f49f69884-v8xz2"
Mar 07 21:43:13.710450 master-0 kubenswrapper[16352]: I0307 21:43:13.708386 16352 generic.go:334] "Generic (PLEG): container finished" podID="f623599e-9cea-49ec-a621-f676a75574f9" containerID="2ce34483211f1234aa598d6b4877465bf2b3da6001d0f92033be8f259c68d8a2" exitCode=0
Mar 07 21:43:13.710450 master-0 kubenswrapper[16352]: I0307 21:43:13.708439 16352 generic.go:334] "Generic (PLEG): container finished" podID="f623599e-9cea-49ec-a621-f676a75574f9" containerID="919ca7a10f7d2057908b264c89ded111932385985e0c2acc946a7e48c2af77dc" exitCode=143
Mar 07 21:43:13.710450 master-0 kubenswrapper[16352]: I0307 21:43:13.708478 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"f623599e-9cea-49ec-a621-f676a75574f9","Type":"ContainerDied","Data":"2ce34483211f1234aa598d6b4877465bf2b3da6001d0f92033be8f259c68d8a2"}
Mar 07 21:43:13.710450 master-0 kubenswrapper[16352]: I0307 21:43:13.708519 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"f623599e-9cea-49ec-a621-f676a75574f9","Type":"ContainerDied","Data":"919ca7a10f7d2057908b264c89ded111932385985e0c2acc946a7e48c2af77dc"}
Mar 07 21:43:13.717108 master-0 kubenswrapper[16352]: I0307 21:43:13.715207 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.154651285 podStartE2EDuration="23.715183755s" podCreationTimestamp="2026-03-07 21:42:50 +0000 UTC" firstStartedPulling="2026-03-07 21:42:52.40847279 +0000 UTC m=+1495.479177849" lastFinishedPulling="2026-03-07 21:43:12.96900526 +0000 UTC m=+1516.039710319" observedRunningTime="2026-03-07 21:43:13.711265221 +0000 UTC m=+1516.781970290" watchObservedRunningTime="2026-03-07 21:43:13.715183755 +0000 UTC m=+1516.785888824"
Mar 07 21:43:13.851725 master-0 kubenswrapper[16352]: I0307 21:43:13.851112 16352 util.go:48]
"No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:13.880719 master-0 kubenswrapper[16352]: I0307 21:43:13.880334 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-f49f69884-v8xz2"] Mar 07 21:43:13.904057 master-0 kubenswrapper[16352]: I0307 21:43:13.889452 16352 scope.go:117] "RemoveContainer" containerID="fa9fa60fa174582ce2847403b83e63de271e277c86bae70067df78a0f025b6ee" Mar 07 21:43:13.904057 master-0 kubenswrapper[16352]: I0307 21:43:13.896455 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-f49f69884-v8xz2"] Mar 07 21:43:14.022728 master-0 kubenswrapper[16352]: I0307 21:43:14.022636 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-scripts\") pod \"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.023446 master-0 kubenswrapper[16352]: I0307 21:43:14.023430 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-logs\") pod \"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.023950 master-0 kubenswrapper[16352]: I0307 21:43:14.023930 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.024137 master-0 kubenswrapper[16352]: I0307 21:43:14.024105 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-combined-ca-bundle\") pod 
\"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.024365 master-0 kubenswrapper[16352]: I0307 21:43:14.024351 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-httpd-run\") pod \"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.025179 master-0 kubenswrapper[16352]: I0307 21:43:14.024600 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-config-data\") pod \"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.025431 master-0 kubenswrapper[16352]: I0307 21:43:14.025401 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-644h9\" (UniqueName: \"kubernetes.io/projected/f623599e-9cea-49ec-a621-f676a75574f9-kube-api-access-644h9\") pod \"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.025755 master-0 kubenswrapper[16352]: I0307 21:43:14.025739 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-public-tls-certs\") pod \"f623599e-9cea-49ec-a621-f676a75574f9\" (UID: \"f623599e-9cea-49ec-a621-f676a75574f9\") " Mar 07 21:43:14.037135 master-0 kubenswrapper[16352]: I0307 21:43:14.029721 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:43:14.037135 master-0 kubenswrapper[16352]: I0307 21:43:14.029871 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-logs" (OuterVolumeSpecName: "logs") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:43:14.037135 master-0 kubenswrapper[16352]: I0307 21:43:14.032127 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-scripts" (OuterVolumeSpecName: "scripts") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.045022 master-0 kubenswrapper[16352]: I0307 21:43:14.040956 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f623599e-9cea-49ec-a621-f676a75574f9-kube-api-access-644h9" (OuterVolumeSpecName: "kube-api-access-644h9") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "kube-api-access-644h9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:14.078181 master-0 kubenswrapper[16352]: I0307 21:43:14.070180 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2" (OuterVolumeSpecName: "glance") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 21:43:14.104708 master-0 kubenswrapper[16352]: I0307 21:43:14.104166 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.112058 master-0 kubenswrapper[16352]: I0307 21:43:14.112006 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-config-data" (OuterVolumeSpecName: "config-data") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.131536 master-0 kubenswrapper[16352]: I0307 21:43:14.131476 16352 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") on node \"master-0\" " Mar 07 21:43:14.134236 master-0 kubenswrapper[16352]: I0307 21:43:14.134093 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.134236 master-0 kubenswrapper[16352]: I0307 21:43:14.134147 16352 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.134236 master-0 kubenswrapper[16352]: I0307 21:43:14.134158 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.134236 master-0 kubenswrapper[16352]: I0307 21:43:14.134170 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-644h9\" (UniqueName: \"kubernetes.io/projected/f623599e-9cea-49ec-a621-f676a75574f9-kube-api-access-644h9\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.134236 master-0 kubenswrapper[16352]: I0307 21:43:14.134182 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.134236 master-0 kubenswrapper[16352]: I0307 21:43:14.134191 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f623599e-9cea-49ec-a621-f676a75574f9-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.203722 master-0 kubenswrapper[16352]: I0307 21:43:14.203569 16352 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 07 21:43:14.221756 master-0 kubenswrapper[16352]: I0307 21:43:14.221672 16352 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3" (UniqueName: "kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2") on node "master-0" Mar 07 21:43:14.238778 master-0 kubenswrapper[16352]: I0307 21:43:14.238615 16352 reconciler_common.go:293] "Volume detached for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.286103 master-0 kubenswrapper[16352]: I0307 21:43:14.285988 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "f623599e-9cea-49ec-a621-f676a75574f9" (UID: "f623599e-9cea-49ec-a621-f676a75574f9"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.327932 master-0 kubenswrapper[16352]: I0307 21:43:14.327877 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:43:14.348145 master-0 kubenswrapper[16352]: I0307 21:43:14.347151 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-scripts\") pod \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " Mar 07 21:43:14.348145 master-0 kubenswrapper[16352]: I0307 21:43:14.347584 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-internal-tls-certs\") pod \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " Mar 07 21:43:14.348145 master-0 kubenswrapper[16352]: I0307 21:43:14.347757 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-public-tls-certs\") pod \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " Mar 07 21:43:14.348145 master-0 kubenswrapper[16352]: I0307 21:43:14.347796 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-config-data\") pod \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " Mar 07 21:43:14.348145 master-0 kubenswrapper[16352]: I0307 21:43:14.347928 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xrvv\" (UniqueName: \"kubernetes.io/projected/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-kube-api-access-7xrvv\") pod \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " Mar 07 21:43:14.348145 master-0 kubenswrapper[16352]: I0307 21:43:14.347956 16352 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-logs\") pod \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " Mar 07 21:43:14.348145 master-0 kubenswrapper[16352]: I0307 21:43:14.347997 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-combined-ca-bundle\") pod \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\" (UID: \"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f\") " Mar 07 21:43:14.348812 master-0 kubenswrapper[16352]: I0307 21:43:14.348776 16352 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/f623599e-9cea-49ec-a621-f676a75574f9-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.349872 master-0 kubenswrapper[16352]: I0307 21:43:14.349828 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-logs" (OuterVolumeSpecName: "logs") pod "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" (UID: "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:43:14.353775 master-0 kubenswrapper[16352]: I0307 21:43:14.353650 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-kube-api-access-7xrvv" (OuterVolumeSpecName: "kube-api-access-7xrvv") pod "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" (UID: "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f"). InnerVolumeSpecName "kube-api-access-7xrvv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:14.358317 master-0 kubenswrapper[16352]: I0307 21:43:14.358252 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-scripts" (OuterVolumeSpecName: "scripts") pod "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" (UID: "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.366965 master-0 kubenswrapper[16352]: I0307 21:43:14.363821 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-26xt5"] Mar 07 21:43:14.370718 master-0 kubenswrapper[16352]: W0307 21:43:14.367540 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod38ea555e_6a67_483f_96a2_9587104b0c38.slice/crio-7b0cf56f322faebd19a05059fe306d3309485cbe7cbf0b2829f2d1b2abd52a89 WatchSource:0}: Error finding container 7b0cf56f322faebd19a05059fe306d3309485cbe7cbf0b2829f2d1b2abd52a89: Status 404 returned error can't find the container with id 7b0cf56f322faebd19a05059fe306d3309485cbe7cbf0b2829f2d1b2abd52a89 Mar 07 21:43:14.380502 master-0 kubenswrapper[16352]: I0307 21:43:14.380410 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-94ssk"] Mar 07 21:43:14.450811 master-0 kubenswrapper[16352]: I0307 21:43:14.450725 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.450811 master-0 kubenswrapper[16352]: I0307 21:43:14.450777 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7xrvv\" (UniqueName: \"kubernetes.io/projected/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-kube-api-access-7xrvv\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.450811 master-0 
kubenswrapper[16352]: I0307 21:43:14.450791 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.457384 master-0 kubenswrapper[16352]: I0307 21:43:14.457309 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-config-data" (OuterVolumeSpecName: "config-data") pod "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" (UID: "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.503333 master-0 kubenswrapper[16352]: I0307 21:43:14.503266 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" (UID: "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.513742 master-0 kubenswrapper[16352]: I0307 21:43:14.508924 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:43:14.560744 master-0 kubenswrapper[16352]: I0307 21:43:14.554453 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.560744 master-0 kubenswrapper[16352]: I0307 21:43:14.554521 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.561600 master-0 kubenswrapper[16352]: I0307 21:43:14.561355 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" (UID: "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.648721 master-0 kubenswrapper[16352]: I0307 21:43:14.647943 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7754f44b87-jrdnd"] Mar 07 21:43:14.659147 master-0 kubenswrapper[16352]: W0307 21:43:14.659061 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod27e6c72c_28fc_4783_a670_31fe4f9b98fe.slice/crio-1a11555c46d47147ea59dd5f39a02f1f954b6f92e0fccf4380cbe37bdd69e46e WatchSource:0}: Error finding container 1a11555c46d47147ea59dd5f39a02f1f954b6f92e0fccf4380cbe37bdd69e46e: Status 404 returned error can't find the container with id 1a11555c46d47147ea59dd5f39a02f1f954b6f92e0fccf4380cbe37bdd69e46e Mar 07 21:43:14.662804 master-0 kubenswrapper[16352]: I0307 21:43:14.661322 16352 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-public-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.683120 master-0 kubenswrapper[16352]: I0307 21:43:14.683037 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" (UID: "5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:14.730646 master-0 kubenswrapper[16352]: I0307 21:43:14.728858 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-26xt5" event={"ID":"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb","Type":"ContainerStarted","Data":"0f7ccd9b7255d1f82a4e994599fcbae5a990d628bf555148310806c85f3742e8"} Mar 07 21:43:14.730646 master-0 kubenswrapper[16352]: I0307 21:43:14.728906 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-26xt5" event={"ID":"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb","Type":"ContainerStarted","Data":"c3b555a3c95fbd9e6b453987f5fdf4412e97e07c7e91d57ec0ca3de5807b9ebc"} Mar 07 21:43:14.739795 master-0 kubenswrapper[16352]: I0307 21:43:14.739588 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-94ssk" event={"ID":"38ea555e-6a67-483f-96a2-9587104b0c38","Type":"ContainerStarted","Data":"6a89908eff7c42fec7d5f6d4895174aaf23c7087916ae432ee41bc16adbf7c84"} Mar 07 21:43:14.739795 master-0 kubenswrapper[16352]: I0307 21:43:14.739650 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-94ssk" event={"ID":"38ea555e-6a67-483f-96a2-9587104b0c38","Type":"ContainerStarted","Data":"7b0cf56f322faebd19a05059fe306d3309485cbe7cbf0b2829f2d1b2abd52a89"} Mar 07 21:43:14.748573 master-0 kubenswrapper[16352]: I0307 21:43:14.748530 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-6cc7544794-vmcq4" event={"ID":"5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f","Type":"ContainerDied","Data":"a1cc218270e0db485436dae256ca4c7372913f7f349d674148a4b32c43907635"} Mar 07 21:43:14.748573 master-0 kubenswrapper[16352]: I0307 21:43:14.748579 16352 scope.go:117] "RemoveContainer" containerID="a039f4a32907524546ecdece0f12ff81c8e16de6f2c7d7dbc0e3d8831b933b8c" Mar 07 21:43:14.748948 master-0 kubenswrapper[16352]: I0307 21:43:14.748738 16352 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/placement-6cc7544794-vmcq4" Mar 07 21:43:14.764171 master-0 kubenswrapper[16352]: I0307 21:43:14.759590 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"f623599e-9cea-49ec-a621-f676a75574f9","Type":"ContainerDied","Data":"707ceeffb2d395e854b73ef5f636b608a948fcf872762451c33b86b6d8dfdc22"} Mar 07 21:43:14.764171 master-0 kubenswrapper[16352]: I0307 21:43:14.759645 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:14.764171 master-0 kubenswrapper[16352]: I0307 21:43:14.762544 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c7d1db8d-1ad0-49f1-a993-d68d3587f595","Type":"ContainerStarted","Data":"af3e642aa2e6d89721b8a660371dca6db0ff7b3076aab0da53dec183dba18102"} Mar 07 21:43:14.764171 master-0 kubenswrapper[16352]: I0307 21:43:14.763728 16352 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:14.823694 master-0 kubenswrapper[16352]: I0307 21:43:14.820947 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerStarted","Data":"ef75ed0336d010624a52fb1e393805126085a9fe8964f5b60ac3f4ea3faae039"} Mar 07 21:43:14.831110 master-0 kubenswrapper[16352]: I0307 21:43:14.830978 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-26xt5" podStartSLOduration=15.830941828 podStartE2EDuration="15.830941828s" podCreationTimestamp="2026-03-07 21:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-07 21:43:14.756209413 +0000 UTC m=+1517.826914492" watchObservedRunningTime="2026-03-07 21:43:14.830941828 +0000 UTC m=+1517.901646887" Mar 07 21:43:14.871020 master-0 kubenswrapper[16352]: I0307 21:43:14.870907 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-db-create-94ssk" podStartSLOduration=15.870881338 podStartE2EDuration="15.870881338s" podCreationTimestamp="2026-03-07 21:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:43:14.781140171 +0000 UTC m=+1517.851845230" watchObservedRunningTime="2026-03-07 21:43:14.870881338 +0000 UTC m=+1517.941586407" Mar 07 21:43:14.887025 master-0 kubenswrapper[16352]: W0307 21:43:14.886177 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b487d67_175f_402c_883f_a4001fd9160c.slice/crio-e2d6d3864b1891ea00b3187160c0956a87d11453b086e70698c63c8ac6bc8678 WatchSource:0}: Error finding container e2d6d3864b1891ea00b3187160c0956a87d11453b086e70698c63c8ac6bc8678: Status 404 returned error can't find the container with id e2d6d3864b1891ea00b3187160c0956a87d11453b086e70698c63c8ac6bc8678 Mar 07 21:43:14.889075 master-0 kubenswrapper[16352]: W0307 21:43:14.889013 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2130caf7_6f24_4cb7_a216_e60f2b951f4a.slice/crio-1fb1396ee29687d5b9e218d97f02820f0e7a1fee01c7100e737d625958e7b6ee WatchSource:0}: Error finding container 1fb1396ee29687d5b9e218d97f02820f0e7a1fee01c7100e737d625958e7b6ee: Status 404 returned error can't find the container with id 1fb1396ee29687d5b9e218d97f02820f0e7a1fee01c7100e737d625958e7b6ee Mar 07 21:43:14.914072 master-0 kubenswrapper[16352]: W0307 21:43:14.914009 16352 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5793afc9_06c2_497b_ab66_92254e79e871.slice/crio-ead7ade37095ad8dc4ef467d740f2e748b588b490b724bbe83b5cfc3b2b4b41f WatchSource:0}: Error finding container ead7ade37095ad8dc4ef467d740f2e748b588b490b724bbe83b5cfc3b2b4b41f: Status 404 returned error can't find the container with id ead7ade37095ad8dc4ef467d740f2e748b588b490b724bbe83b5cfc3b2b4b41f Mar 07 21:43:14.915494 master-0 kubenswrapper[16352]: I0307 21:43:14.915389 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" event={"ID":"55b7e31a-1da5-4528-b904-db7de86e1f26","Type":"ContainerStarted","Data":"04eac2071ffeec1ad3936b2cc98d437f772ab02b86de6d68a945b2307ec4bb0d"} Mar 07 21:43:14.915737 master-0 kubenswrapper[16352]: I0307 21:43:14.915709 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:43:14.922630 master-0 kubenswrapper[16352]: I0307 21:43:14.921724 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"05fd5d5c-a1d5-49d5-bd52-189f40a2dc43","Type":"ContainerStarted","Data":"11cf5277e331dbe1c58d1f8c1a8aad69c0830c3a37a1d76be80aa7395bbede0c"} Mar 07 21:43:14.926754 master-0 kubenswrapper[16352]: I0307 21:43:14.926693 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" event={"ID":"27e6c72c-28fc-4783-a670-31fe4f9b98fe","Type":"ContainerStarted","Data":"1a11555c46d47147ea59dd5f39a02f1f954b6f92e0fccf4380cbe37bdd69e46e"} Mar 07 21:43:14.943302 master-0 kubenswrapper[16352]: I0307 21:43:14.943106 16352 scope.go:117] "RemoveContainer" containerID="56401df6f757863c0a67a6d2486cdef43b3b8a5aea7f0f3263e07faea84a7d68" Mar 07 21:43:14.994697 master-0 kubenswrapper[16352]: I0307 21:43:14.994616 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-64285"] Mar 07 21:43:15.021725 master-0 
kubenswrapper[16352]: I0307 21:43:15.021630 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-8a73-account-create-update-s57x2"] Mar 07 21:43:15.043058 master-0 kubenswrapper[16352]: I0307 21:43:15.042494 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-2b75-account-create-update-gqckp"] Mar 07 21:43:15.072538 master-0 kubenswrapper[16352]: I0307 21:43:15.072466 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-0300-account-create-update-b66m5"] Mar 07 21:43:15.160624 master-0 kubenswrapper[16352]: I0307 21:43:15.160045 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:15.296876 master-0 kubenswrapper[16352]: I0307 21:43:15.288436 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" path="/var/lib/kubelet/pods/8c877c04-56be-4df0-b751-4691351e9f5d/volumes" Mar 07 21:43:15.296876 master-0 kubenswrapper[16352]: I0307 21:43:15.290074 16352 scope.go:117] "RemoveContainer" containerID="2ce34483211f1234aa598d6b4877465bf2b3da6001d0f92033be8f259c68d8a2" Mar 07 21:43:15.302468 master-0 kubenswrapper[16352]: I0307 21:43:15.302396 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-combined-ca-bundle\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.302572 master-0 kubenswrapper[16352]: I0307 21:43:15.302496 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk9bt\" (UniqueName: \"kubernetes.io/projected/34febdc7-58ae-4ec2-a8f3-92011ca01d81-kube-api-access-pk9bt\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.302572 master-0 
kubenswrapper[16352]: I0307 21:43:15.302533 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-scripts\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.303269 master-0 kubenswrapper[16352]: I0307 21:43:15.303242 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.305943 master-0 kubenswrapper[16352]: I0307 21:43:15.305893 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-internal-tls-certs\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.306039 master-0 kubenswrapper[16352]: I0307 21:43:15.306014 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-httpd-run\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.306103 master-0 kubenswrapper[16352]: I0307 21:43:15.306058 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.306159 master-0 kubenswrapper[16352]: I0307 21:43:15.306103 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-logs\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.306511 master-0 kubenswrapper[16352]: I0307 21:43:15.306473 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:43:15.307044 master-0 kubenswrapper[16352]: I0307 21:43:15.306825 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-logs" (OuterVolumeSpecName: "logs") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:43:15.308034 master-0 kubenswrapper[16352]: I0307 21:43:15.307990 16352 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-httpd-run\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.308034 master-0 kubenswrapper[16352]: I0307 21:43:15.308029 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/34febdc7-58ae-4ec2-a8f3-92011ca01d81-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.340264 master-0 kubenswrapper[16352]: I0307 21:43:15.339806 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-scripts" (OuterVolumeSpecName: "scripts") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:15.344479 master-0 kubenswrapper[16352]: I0307 21:43:15.344441 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/34febdc7-58ae-4ec2-a8f3-92011ca01d81-kube-api-access-pk9bt" (OuterVolumeSpecName: "kube-api-access-pk9bt") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "kube-api-access-pk9bt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:15.347544 master-0 kubenswrapper[16352]: I0307 21:43:15.347222 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e" (OuterVolumeSpecName: "glance") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "pvc-2828e4cd-2480-4309-bb23-a8e5342365ce". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 07 21:43:15.357266 master-0 kubenswrapper[16352]: I0307 21:43:15.356985 16352 scope.go:117] "RemoveContainer" containerID="919ca7a10f7d2057908b264c89ded111932385985e0c2acc946a7e48c2af77dc" Mar 07 21:43:15.389152 master-0 kubenswrapper[16352]: I0307 21:43:15.389097 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:43:15.413061 master-0 kubenswrapper[16352]: I0307 21:43:15.412964 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk9bt\" (UniqueName: \"kubernetes.io/projected/34febdc7-58ae-4ec2-a8f3-92011ca01d81-kube-api-access-pk9bt\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.413061 master-0 kubenswrapper[16352]: I0307 21:43:15.413050 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.413700 master-0 kubenswrapper[16352]: I0307 
21:43:15.413124 16352 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") on node \"master-0\" " Mar 07 21:43:15.441173 master-0 kubenswrapper[16352]: I0307 21:43:15.441048 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:43:15.454442 master-0 kubenswrapper[16352]: I0307 21:43:15.454296 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-6cc7544794-vmcq4"] Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.473802 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.474542 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.474562 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.474584 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.474592 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.474624 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.474658 16352 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.474681 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.475467 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.475532 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-api" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.475542 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-api" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.475625 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.475633 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.475669 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.475676 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: E0307 21:43:15.475738 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-api" Mar 07 21:43:15.477335 master-0 
kubenswrapper[16352]: I0307 21:43:15.475746 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-api" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476504 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476535 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476562 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476579 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-api" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476596 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" containerName="placement-api" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476610 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" containerName="glance-httpd" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476629 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f623599e-9cea-49ec-a621-f676a75574f9" containerName="glance-log" Mar 07 21:43:15.477335 master-0 kubenswrapper[16352]: I0307 21:43:15.476647 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c877c04-56be-4df0-b751-4691351e9f5d" containerName="neutron-httpd" Mar 07 21:43:15.478666 master-0 kubenswrapper[16352]: I0307 21:43:15.478637 16352 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.484020 master-0 kubenswrapper[16352]: I0307 21:43:15.483889 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-6cc7544794-vmcq4"] Mar 07 21:43:15.485472 master-0 kubenswrapper[16352]: I0307 21:43:15.485291 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-213eb-default-external-config-data" Mar 07 21:43:15.490544 master-0 kubenswrapper[16352]: I0307 21:43:15.490492 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 07 21:43:15.498045 master-0 kubenswrapper[16352]: I0307 21:43:15.497629 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:43:15.619429 master-0 kubenswrapper[16352]: I0307 21:43:15.619366 16352 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... 
Mar 07 21:43:15.619799 master-0 kubenswrapper[16352]: I0307 21:43:15.619740 16352 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2828e4cd-2480-4309-bb23-a8e5342365ce" (UniqueName: "kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e") on node "master-0" Mar 07 21:43:15.626907 master-0 kubenswrapper[16352]: I0307 21:43:15.626821 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.627059 master-0 kubenswrapper[16352]: I0307 21:43:15.626938 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dfded6e-7754-4d42-96ab-1bab69176899-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.627059 master-0 kubenswrapper[16352]: I0307 21:43:15.627043 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dfded6e-7754-4d42-96ab-1bab69176899-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.627200 master-0 kubenswrapper[16352]: I0307 21:43:15.627152 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " 
pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.628037 master-0 kubenswrapper[16352]: I0307 21:43:15.627474 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.628037 master-0 kubenswrapper[16352]: I0307 21:43:15.627557 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wsr6\" (UniqueName: \"kubernetes.io/projected/3dfded6e-7754-4d42-96ab-1bab69176899-kube-api-access-7wsr6\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.628697 master-0 kubenswrapper[16352]: I0307 21:43:15.628621 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.628779 master-0 kubenswrapper[16352]: I0307 21:43:15.628757 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.629320 master-0 kubenswrapper[16352]: I0307 21:43:15.629289 16352 reconciler_common.go:293] "Volume detached for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" 
(UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.691279 master-0 kubenswrapper[16352]: I0307 21:43:15.691176 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:15.730795 master-0 kubenswrapper[16352]: I0307 21:43:15.729998 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data" (OuterVolumeSpecName: "config-data") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:15.731629 master-0 kubenswrapper[16352]: I0307 21:43:15.731555 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data\") pod \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\" (UID: \"34febdc7-58ae-4ec2-a8f3-92011ca01d81\") " Mar 07 21:43:15.732537 master-0 kubenswrapper[16352]: I0307 21:43:15.732490 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.732601 master-0 kubenswrapper[16352]: I0307 21:43:15.732548 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/3dfded6e-7754-4d42-96ab-1bab69176899-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.732601 master-0 kubenswrapper[16352]: I0307 21:43:15.732590 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dfded6e-7754-4d42-96ab-1bab69176899-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.732714 master-0 kubenswrapper[16352]: I0307 21:43:15.732660 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.732862 master-0 kubenswrapper[16352]: I0307 21:43:15.732826 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.732907 master-0 kubenswrapper[16352]: I0307 21:43:15.732885 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wsr6\" (UniqueName: \"kubernetes.io/projected/3dfded6e-7754-4d42-96ab-1bab69176899-kube-api-access-7wsr6\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.733009 master-0 kubenswrapper[16352]: I0307 21:43:15.732977 16352 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.733056 master-0 kubenswrapper[16352]: I0307 21:43:15.733023 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.733379 master-0 kubenswrapper[16352]: W0307 21:43:15.733341 16352 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/34febdc7-58ae-4ec2-a8f3-92011ca01d81/volumes/kubernetes.io~secret/config-data Mar 07 21:43:15.733379 master-0 kubenswrapper[16352]: I0307 21:43:15.733372 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data" (OuterVolumeSpecName: "config-data") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:15.737425 master-0 kubenswrapper[16352]: I0307 21:43:15.734983 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.738298 master-0 kubenswrapper[16352]: I0307 21:43:15.738215 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3dfded6e-7754-4d42-96ab-1bab69176899-logs\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.739387 master-0 kubenswrapper[16352]: I0307 21:43:15.738539 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3dfded6e-7754-4d42-96ab-1bab69176899-httpd-run\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.741990 master-0 kubenswrapper[16352]: I0307 21:43:15.741944 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 07 21:43:15.742067 master-0 kubenswrapper[16352]: I0307 21:43:15.741999 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/a513f999ca477ba7af8fd57b1445c957c7136c73e46ac94a843087871d1d0d27/globalmount\"" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.742125 master-0 kubenswrapper[16352]: I0307 21:43:15.742000 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-combined-ca-bundle\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.745248 master-0 kubenswrapper[16352]: I0307 21:43:15.745194 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-public-tls-certs\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.745714 master-0 kubenswrapper[16352]: I0307 21:43:15.745641 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-scripts\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.745714 master-0 kubenswrapper[16352]: I0307 21:43:15.745609 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/3dfded6e-7754-4d42-96ab-1bab69176899-config-data\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.749870 master-0 kubenswrapper[16352]: I0307 21:43:15.749811 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "34febdc7-58ae-4ec2-a8f3-92011ca01d81" (UID: "34febdc7-58ae-4ec2-a8f3-92011ca01d81"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:15.770351 master-0 kubenswrapper[16352]: I0307 21:43:15.770290 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wsr6\" (UniqueName: \"kubernetes.io/projected/3dfded6e-7754-4d42-96ab-1bab69176899-kube-api-access-7wsr6\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:15.839263 master-0 kubenswrapper[16352]: I0307 21:43:15.838372 16352 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-internal-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.839263 master-0 kubenswrapper[16352]: I0307 21:43:15.838430 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/34febdc7-58ae-4ec2-a8f3-92011ca01d81-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:15.947212 master-0 kubenswrapper[16352]: I0307 21:43:15.947141 16352 generic.go:334] "Generic (PLEG): container finished" podID="38ea555e-6a67-483f-96a2-9587104b0c38" containerID="6a89908eff7c42fec7d5f6d4895174aaf23c7087916ae432ee41bc16adbf7c84" exitCode=0 Mar 07 21:43:15.947343 master-0 kubenswrapper[16352]: 
I0307 21:43:15.947236 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-94ssk" event={"ID":"38ea555e-6a67-483f-96a2-9587104b0c38","Type":"ContainerDied","Data":"6a89908eff7c42fec7d5f6d4895174aaf23c7087916ae432ee41bc16adbf7c84"} Mar 07 21:43:15.953457 master-0 kubenswrapper[16352]: I0307 21:43:15.953390 16352 generic.go:334] "Generic (PLEG): container finished" podID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerID="87863a99040258e3132a9d56d477a96b22234d6ff00c03d87ba512e02820601c" exitCode=0 Mar 07 21:43:15.953575 master-0 kubenswrapper[16352]: I0307 21:43:15.953497 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" event={"ID":"27e6c72c-28fc-4783-a670-31fe4f9b98fe","Type":"ContainerDied","Data":"87863a99040258e3132a9d56d477a96b22234d6ff00c03d87ba512e02820601c"} Mar 07 21:43:15.955897 master-0 kubenswrapper[16352]: I0307 21:43:15.955793 16352 generic.go:334] "Generic (PLEG): container finished" podID="5793afc9-06c2-497b-ab66-92254e79e871" containerID="a57de8bad7fe7a4bc9a511bfc6a99d1c0b27b61a43c65d2d32cc3ef5921f3cb5" exitCode=0 Mar 07 21:43:15.956084 master-0 kubenswrapper[16352]: I0307 21:43:15.956037 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2b75-account-create-update-gqckp" event={"ID":"5793afc9-06c2-497b-ab66-92254e79e871","Type":"ContainerDied","Data":"a57de8bad7fe7a4bc9a511bfc6a99d1c0b27b61a43c65d2d32cc3ef5921f3cb5"} Mar 07 21:43:15.956155 master-0 kubenswrapper[16352]: I0307 21:43:15.956086 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2b75-account-create-update-gqckp" event={"ID":"5793afc9-06c2-497b-ab66-92254e79e871","Type":"ContainerStarted","Data":"ead7ade37095ad8dc4ef467d740f2e748b588b490b724bbe83b5cfc3b2b4b41f"} Mar 07 21:43:15.965158 master-0 kubenswrapper[16352]: I0307 21:43:15.965076 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:15.965332 master-0 kubenswrapper[16352]: I0307 21:43:15.965067 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"34febdc7-58ae-4ec2-a8f3-92011ca01d81","Type":"ContainerDied","Data":"53434ff7cb569c3a68cf778605fd4dfa0469825f562a4a135f5c46c83a3910bf"} Mar 07 21:43:15.965332 master-0 kubenswrapper[16352]: I0307 21:43:15.965284 16352 scope.go:117] "RemoveContainer" containerID="d75b2a2e94bc7bcc078b90db1a505b9b3425739b0825ebdd8b77bcc6009b212e" Mar 07 21:43:15.968400 master-0 kubenswrapper[16352]: I0307 21:43:15.968340 16352 generic.go:334] "Generic (PLEG): container finished" podID="c7d1db8d-1ad0-49f1-a993-d68d3587f595" containerID="a828ab689dd5fe3c8816df46e6b6da87969144726027132286e1ab5bd325023c" exitCode=0 Mar 07 21:43:15.968790 master-0 kubenswrapper[16352]: I0307 21:43:15.968745 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c7d1db8d-1ad0-49f1-a993-d68d3587f595","Type":"ContainerDied","Data":"a828ab689dd5fe3c8816df46e6b6da87969144726027132286e1ab5bd325023c"} Mar 07 21:43:15.987780 master-0 kubenswrapper[16352]: I0307 21:43:15.987702 16352 generic.go:334] "Generic (PLEG): container finished" podID="2130caf7-6f24-4cb7-a216-e60f2b951f4a" containerID="e632e2d9f4f74bb8b395559e6a9867df4987eed2140f492aa24bc6165f5f4701" exitCode=0 Mar 07 21:43:15.988081 master-0 kubenswrapper[16352]: I0307 21:43:15.987898 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-64285" event={"ID":"2130caf7-6f24-4cb7-a216-e60f2b951f4a","Type":"ContainerDied","Data":"e632e2d9f4f74bb8b395559e6a9867df4987eed2140f492aa24bc6165f5f4701"} Mar 07 21:43:15.988081 master-0 kubenswrapper[16352]: I0307 21:43:15.987956 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-64285" 
event={"ID":"2130caf7-6f24-4cb7-a216-e60f2b951f4a","Type":"ContainerStarted","Data":"1fb1396ee29687d5b9e218d97f02820f0e7a1fee01c7100e737d625958e7b6ee"} Mar 07 21:43:15.993749 master-0 kubenswrapper[16352]: I0307 21:43:15.992851 16352 generic.go:334] "Generic (PLEG): container finished" podID="0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb" containerID="0f7ccd9b7255d1f82a4e994599fcbae5a990d628bf555148310806c85f3742e8" exitCode=0 Mar 07 21:43:15.993749 master-0 kubenswrapper[16352]: I0307 21:43:15.992952 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-26xt5" event={"ID":"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb","Type":"ContainerDied","Data":"0f7ccd9b7255d1f82a4e994599fcbae5a990d628bf555148310806c85f3742e8"} Mar 07 21:43:15.995486 master-0 kubenswrapper[16352]: I0307 21:43:15.995443 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8a73-account-create-update-s57x2" event={"ID":"2b487d67-175f-402c-883f-a4001fd9160c","Type":"ContainerStarted","Data":"c549b034ce2d76254f58082bb77f95e290dc619e8499900094d1366d9585e905"} Mar 07 21:43:15.995545 master-0 kubenswrapper[16352]: I0307 21:43:15.995487 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8a73-account-create-update-s57x2" event={"ID":"2b487d67-175f-402c-883f-a4001fd9160c","Type":"ContainerStarted","Data":"e2d6d3864b1891ea00b3187160c0956a87d11453b086e70698c63c8ac6bc8678"} Mar 07 21:43:16.019022 master-0 kubenswrapper[16352]: I0307 21:43:16.018887 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0300-account-create-update-b66m5" event={"ID":"6bfd3214-eccf-402a-93de-6bd7f2cb2c08","Type":"ContainerStarted","Data":"15ecc592ea1265251f1a62e583ec349c3f5ed1c236d8ea5e0f294bff44d45ff6"} Mar 07 21:43:16.019022 master-0 kubenswrapper[16352]: I0307 21:43:16.018977 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0300-account-create-update-b66m5" 
event={"ID":"6bfd3214-eccf-402a-93de-6bd7f2cb2c08","Type":"ContainerStarted","Data":"b42426e70540d73d2267bd013342c2d645946a7deb2e060b5467ab4c06c7733c"} Mar 07 21:43:16.074243 master-0 kubenswrapper[16352]: I0307 21:43:16.074139 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-neutron-agent-89874fdc8-kjtzj" Mar 07 21:43:16.131658 master-0 kubenswrapper[16352]: I0307 21:43:16.131529 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-0300-account-create-update-b66m5" podStartSLOduration=17.13150369 podStartE2EDuration="17.13150369s" podCreationTimestamp="2026-03-07 21:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:43:16.12320227 +0000 UTC m=+1519.193907329" watchObservedRunningTime="2026-03-07 21:43:16.13150369 +0000 UTC m=+1519.202208749" Mar 07 21:43:16.208087 master-0 kubenswrapper[16352]: I0307 21:43:16.207964 16352 scope.go:117] "RemoveContainer" containerID="58b019180d18af9978ed984fa8ea3f3388b3fd37aa0ec168b38ef005be8346d2" Mar 07 21:43:16.296848 master-0 kubenswrapper[16352]: I0307 21:43:16.296402 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-8a73-account-create-update-s57x2" podStartSLOduration=17.29636856 podStartE2EDuration="17.29636856s" podCreationTimestamp="2026-03-07 21:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:43:16.201242225 +0000 UTC m=+1519.271947284" watchObservedRunningTime="2026-03-07 21:43:16.29636856 +0000 UTC m=+1519.367073619" Mar 07 21:43:16.330666 master-0 kubenswrapper[16352]: I0307 21:43:16.330456 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:43:16.356800 master-0 kubenswrapper[16352]: I0307 
21:43:16.354881 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:43:16.394665 master-0 kubenswrapper[16352]: I0307 21:43:16.394594 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:43:16.397549 master-0 kubenswrapper[16352]: I0307 21:43:16.397253 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.402893 master-0 kubenswrapper[16352]: I0307 21:43:16.401181 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 07 21:43:16.402893 master-0 kubenswrapper[16352]: I0307 21:43:16.401829 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-213eb-default-internal-config-data" Mar 07 21:43:16.434387 master-0 kubenswrapper[16352]: I0307 21:43:16.434316 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:43:16.506159 master-0 kubenswrapper[16352]: I0307 21:43:16.506075 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.508940 master-0 kubenswrapper[16352]: I0307 21:43:16.508801 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/563f0661-4099-45e9-815f-cad67b405fae-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.509872 master-0 
kubenswrapper[16352]: I0307 21:43:16.509148 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.509872 master-0 kubenswrapper[16352]: I0307 21:43:16.509186 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.509872 master-0 kubenswrapper[16352]: I0307 21:43:16.509229 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75j5t\" (UniqueName: \"kubernetes.io/projected/563f0661-4099-45e9-815f-cad67b405fae-kube-api-access-75j5t\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.509872 master-0 kubenswrapper[16352]: I0307 21:43:16.509412 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.509872 master-0 kubenswrapper[16352]: I0307 21:43:16.509637 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/563f0661-4099-45e9-815f-cad67b405fae-httpd-run\") 
pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.509872 master-0 kubenswrapper[16352]: I0307 21:43:16.509746 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.586087 master-0 kubenswrapper[16352]: I0307 21:43:16.586022 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 07 21:43:16.612346 master-0 kubenswrapper[16352]: I0307 21:43:16.612148 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.612346 master-0 kubenswrapper[16352]: I0307 21:43:16.612306 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/563f0661-4099-45e9-815f-cad67b405fae-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.612757 master-0 kubenswrapper[16352]: I0307 21:43:16.612392 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " 
pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.612757 master-0 kubenswrapper[16352]: I0307 21:43:16.612477 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.612757 master-0 kubenswrapper[16352]: I0307 21:43:16.612510 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/563f0661-4099-45e9-815f-cad67b405fae-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.612757 master-0 kubenswrapper[16352]: I0307 21:43:16.612593 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.612757 master-0 kubenswrapper[16352]: I0307 21:43:16.612635 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.612757 master-0 kubenswrapper[16352]: I0307 21:43:16.612670 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75j5t\" (UniqueName: \"kubernetes.io/projected/563f0661-4099-45e9-815f-cad67b405fae-kube-api-access-75j5t\") pod 
\"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.614136 master-0 kubenswrapper[16352]: I0307 21:43:16.614079 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/563f0661-4099-45e9-815f-cad67b405fae-httpd-run\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.614231 master-0 kubenswrapper[16352]: I0307 21:43:16.614161 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/563f0661-4099-45e9-815f-cad67b405fae-logs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.616106 master-0 kubenswrapper[16352]: E0307 21:43:16.615909 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34febdc7_58ae_4ec2_a8f3_92011ca01d81.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34febdc7_58ae_4ec2_a8f3_92011ca01d81.slice/crio-53434ff7cb569c3a68cf778605fd4dfa0469825f562a4a135f5c46c83a3910bf\": RecentStats: unable to find data in memory cache]" Mar 07 21:43:16.617276 master-0 kubenswrapper[16352]: I0307 21:43:16.617226 16352 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 07 21:43:16.617373 master-0 kubenswrapper[16352]: I0307 21:43:16.617283 16352 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/topolvm.io/26e6a054227d2ae645fb0f70048c6b35076d5abdd3e58247e88864732765f6e0/globalmount\"" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.618610 master-0 kubenswrapper[16352]: I0307 21:43:16.618575 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-internal-tls-certs\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.621159 master-0 kubenswrapper[16352]: I0307 21:43:16.621100 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-combined-ca-bundle\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.621382 master-0 kubenswrapper[16352]: I0307 21:43:16.621343 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-config-data\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.622088 master-0 kubenswrapper[16352]: I0307 21:43:16.621866 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/563f0661-4099-45e9-815f-cad67b405fae-scripts\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.639714 master-0 kubenswrapper[16352]: I0307 21:43:16.639634 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75j5t\" (UniqueName: \"kubernetes.io/projected/563f0661-4099-45e9-815f-cad67b405fae-kube-api-access-75j5t\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:16.706872 master-0 kubenswrapper[16352]: I0307 21:43:16.706068 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-4b3e46a7-10f2-435d-87b4-b6dd0a8c16d3\" (UniqueName: \"kubernetes.io/csi/topolvm.io^6afade83-95e7-42cb-bdc2-9f679e1a87c2\") pod \"glance-213eb-default-external-api-0\" (UID: \"3dfded6e-7754-4d42-96ab-1bab69176899\") " pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:16.714504 master-0 kubenswrapper[16352]: I0307 21:43:16.714432 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-config\") pod \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " Mar 07 21:43:16.714631 master-0 kubenswrapper[16352]: I0307 21:43:16.714524 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-79qk4\" (UniqueName: \"kubernetes.io/projected/c7d1db8d-1ad0-49f1-a993-d68d3587f595-kube-api-access-79qk4\") pod \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " Mar 07 21:43:16.714631 master-0 kubenswrapper[16352]: I0307 21:43:16.714608 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-combined-ca-bundle\") pod \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " Mar 07 21:43:16.714840 master-0 kubenswrapper[16352]: I0307 21:43:16.714810 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic\") pod \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " Mar 07 21:43:16.714888 master-0 kubenswrapper[16352]: I0307 21:43:16.714872 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-scripts\") pod \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " Mar 07 21:43:16.714932 master-0 kubenswrapper[16352]: I0307 21:43:16.714912 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7d1db8d-1ad0-49f1-a993-d68d3587f595-etc-podinfo\") pod \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " Mar 07 21:43:16.714998 master-0 kubenswrapper[16352]: I0307 21:43:16.714977 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\" (UID: \"c7d1db8d-1ad0-49f1-a993-d68d3587f595\") " Mar 07 21:43:16.716468 master-0 kubenswrapper[16352]: I0307 21:43:16.716414 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic-inspector-dhcp-hostsdir" (OuterVolumeSpecName: "var-lib-ironic-inspector-dhcp-hostsdir") 
pod "c7d1db8d-1ad0-49f1-a993-d68d3587f595" (UID: "c7d1db8d-1ad0-49f1-a993-d68d3587f595"). InnerVolumeSpecName "var-lib-ironic-inspector-dhcp-hostsdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:43:16.719598 master-0 kubenswrapper[16352]: I0307 21:43:16.719534 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic" (OuterVolumeSpecName: "var-lib-ironic") pod "c7d1db8d-1ad0-49f1-a993-d68d3587f595" (UID: "c7d1db8d-1ad0-49f1-a993-d68d3587f595"). InnerVolumeSpecName "var-lib-ironic". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:43:16.719993 master-0 kubenswrapper[16352]: I0307 21:43:16.719931 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/c7d1db8d-1ad0-49f1-a993-d68d3587f595-etc-podinfo" (OuterVolumeSpecName: "etc-podinfo") pod "c7d1db8d-1ad0-49f1-a993-d68d3587f595" (UID: "c7d1db8d-1ad0-49f1-a993-d68d3587f595"). InnerVolumeSpecName "etc-podinfo". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 07 21:43:16.722052 master-0 kubenswrapper[16352]: I0307 21:43:16.722017 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-config" (OuterVolumeSpecName: "config") pod "c7d1db8d-1ad0-49f1-a993-d68d3587f595" (UID: "c7d1db8d-1ad0-49f1-a993-d68d3587f595"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:16.722480 master-0 kubenswrapper[16352]: I0307 21:43:16.722445 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-213eb-default-external-api-0" Mar 07 21:43:16.725980 master-0 kubenswrapper[16352]: I0307 21:43:16.725931 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c7d1db8d-1ad0-49f1-a993-d68d3587f595-kube-api-access-79qk4" (OuterVolumeSpecName: "kube-api-access-79qk4") pod "c7d1db8d-1ad0-49f1-a993-d68d3587f595" (UID: "c7d1db8d-1ad0-49f1-a993-d68d3587f595"). InnerVolumeSpecName "kube-api-access-79qk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:16.734815 master-0 kubenswrapper[16352]: I0307 21:43:16.734742 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-scripts" (OuterVolumeSpecName: "scripts") pod "c7d1db8d-1ad0-49f1-a993-d68d3587f595" (UID: "c7d1db8d-1ad0-49f1-a993-d68d3587f595"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:16.819338 master-0 kubenswrapper[16352]: I0307 21:43:16.819280 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:16.819338 master-0 kubenswrapper[16352]: I0307 21:43:16.819332 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-79qk4\" (UniqueName: \"kubernetes.io/projected/c7d1db8d-1ad0-49f1-a993-d68d3587f595-kube-api-access-79qk4\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:16.819471 master-0 kubenswrapper[16352]: I0307 21:43:16.819345 16352 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:16.819471 master-0 kubenswrapper[16352]: I0307 21:43:16.819356 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:16.819471 master-0 kubenswrapper[16352]: I0307 21:43:16.819368 16352 reconciler_common.go:293] "Volume detached for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/c7d1db8d-1ad0-49f1-a993-d68d3587f595-etc-podinfo\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:16.819471 master-0 kubenswrapper[16352]: I0307 21:43:16.819379 16352 reconciler_common.go:293] "Volume detached for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/c7d1db8d-1ad0-49f1-a993-d68d3587f595-var-lib-ironic-inspector-dhcp-hostsdir\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:16.887280 master-0 kubenswrapper[16352]: I0307 21:43:16.883977 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "c7d1db8d-1ad0-49f1-a993-d68d3587f595" (UID: "c7d1db8d-1ad0-49f1-a993-d68d3587f595"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:43:16.928614 master-0 kubenswrapper[16352]: I0307 21:43:16.928391 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c7d1db8d-1ad0-49f1-a993-d68d3587f595-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:17.090822 master-0 kubenswrapper[16352]: I0307 21:43:17.090674 16352 generic.go:334] "Generic (PLEG): container finished" podID="2b487d67-175f-402c-883f-a4001fd9160c" containerID="c549b034ce2d76254f58082bb77f95e290dc619e8499900094d1366d9585e905" exitCode=0 Mar 07 21:43:17.090822 master-0 kubenswrapper[16352]: I0307 21:43:17.090819 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8a73-account-create-update-s57x2" event={"ID":"2b487d67-175f-402c-883f-a4001fd9160c","Type":"ContainerDied","Data":"c549b034ce2d76254f58082bb77f95e290dc619e8499900094d1366d9585e905"} Mar 07 21:43:17.095929 master-0 kubenswrapper[16352]: I0307 21:43:17.095851 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" event={"ID":"27e6c72c-28fc-4783-a670-31fe4f9b98fe","Type":"ContainerStarted","Data":"1a77f04dda3772bff59116945cc0f533e1a0ed0929704e64ead5e1e2c1ad0583"} Mar 07 21:43:17.096254 master-0 kubenswrapper[16352]: I0307 21:43:17.096154 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" Mar 07 21:43:17.098036 master-0 kubenswrapper[16352]: I0307 21:43:17.097999 16352 generic.go:334] "Generic (PLEG): container finished" podID="6bfd3214-eccf-402a-93de-6bd7f2cb2c08" containerID="15ecc592ea1265251f1a62e583ec349c3f5ed1c236d8ea5e0f294bff44d45ff6" exitCode=0 Mar 07 21:43:17.098104 master-0 kubenswrapper[16352]: I0307 21:43:17.098064 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0300-account-create-update-b66m5" 
event={"ID":"6bfd3214-eccf-402a-93de-6bd7f2cb2c08","Type":"ContainerDied","Data":"15ecc592ea1265251f1a62e583ec349c3f5ed1c236d8ea5e0f294bff44d45ff6"} Mar 07 21:43:17.113780 master-0 kubenswrapper[16352]: I0307 21:43:17.111749 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 07 21:43:17.113780 master-0 kubenswrapper[16352]: I0307 21:43:17.113323 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"c7d1db8d-1ad0-49f1-a993-d68d3587f595","Type":"ContainerDied","Data":"af3e642aa2e6d89721b8a660371dca6db0ff7b3076aab0da53dec183dba18102"} Mar 07 21:43:17.113780 master-0 kubenswrapper[16352]: I0307 21:43:17.113417 16352 scope.go:117] "RemoveContainer" containerID="a828ab689dd5fe3c8816df46e6b6da87969144726027132286e1ab5bd325023c" Mar 07 21:43:17.258177 master-0 kubenswrapper[16352]: I0307 21:43:17.258035 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" podStartSLOduration=18.25757467 podStartE2EDuration="18.25757467s" podCreationTimestamp="2026-03-07 21:42:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:43:17.188400268 +0000 UTC m=+1520.259105327" watchObservedRunningTime="2026-03-07 21:43:17.25757467 +0000 UTC m=+1520.328279739" Mar 07 21:43:17.297708 master-0 kubenswrapper[16352]: I0307 21:43:17.296424 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="34febdc7-58ae-4ec2-a8f3-92011ca01d81" path="/var/lib/kubelet/pods/34febdc7-58ae-4ec2-a8f3-92011ca01d81/volumes" Mar 07 21:43:17.301710 master-0 kubenswrapper[16352]: I0307 21:43:17.298588 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f" path="/var/lib/kubelet/pods/5d0bdf10-6be4-4ceb-a9f6-cb879cbfd29f/volumes" Mar 07 21:43:17.301710 master-0 
kubenswrapper[16352]: I0307 21:43:17.299946 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f623599e-9cea-49ec-a621-f676a75574f9" path="/var/lib/kubelet/pods/f623599e-9cea-49ec-a621-f676a75574f9/volumes" Mar 07 21:43:17.314498 master-0 kubenswrapper[16352]: I0307 21:43:17.312803 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:43:17.327710 master-0 kubenswrapper[16352]: I0307 21:43:17.327596 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:43:17.352152 master-0 kubenswrapper[16352]: I0307 21:43:17.351291 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:43:17.363205 master-0 kubenswrapper[16352]: E0307 21:43:17.363129 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c7d1db8d-1ad0-49f1-a993-d68d3587f595" containerName="ironic-python-agent-init" Mar 07 21:43:17.363205 master-0 kubenswrapper[16352]: I0307 21:43:17.363199 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="c7d1db8d-1ad0-49f1-a993-d68d3587f595" containerName="ironic-python-agent-init" Mar 07 21:43:17.363662 master-0 kubenswrapper[16352]: I0307 21:43:17.363635 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="c7d1db8d-1ad0-49f1-a993-d68d3587f595" containerName="ironic-python-agent-init" Mar 07 21:43:17.368967 master-0 kubenswrapper[16352]: I0307 21:43:17.367395 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ironic-inspector-0" Mar 07 21:43:17.372189 master-0 kubenswrapper[16352]: I0307 21:43:17.372130 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:43:17.373702 master-0 kubenswrapper[16352]: I0307 21:43:17.373645 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-config-data" Mar 07 21:43:17.374135 master-0 kubenswrapper[16352]: I0307 21:43:17.373850 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-internal-svc" Mar 07 21:43:17.374135 master-0 kubenswrapper[16352]: I0307 21:43:17.373961 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ironic-inspector-public-svc" Mar 07 21:43:17.374135 master-0 kubenswrapper[16352]: I0307 21:43:17.374076 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-transport-url-ironic-inspector-transport" Mar 07 21:43:17.374274 master-0 kubenswrapper[16352]: I0307 21:43:17.374256 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ironic-inspector-scripts" Mar 07 21:43:17.568411 master-0 kubenswrapper[16352]: I0307 21:43:17.568127 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/6e092301-a354-4703-a3a0-a8e0656f11ca-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.568650 master-0 kubenswrapper[16352]: I0307 21:43:17.568449 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-scripts\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " 
pod="openstack/ironic-inspector-0" Mar 07 21:43:17.568650 master-0 kubenswrapper[16352]: I0307 21:43:17.568529 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.568650 master-0 kubenswrapper[16352]: I0307 21:43:17.568607 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.568798 master-0 kubenswrapper[16352]: I0307 21:43:17.568713 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.568798 master-0 kubenswrapper[16352]: I0307 21:43:17.568741 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9gbq\" (UniqueName: \"kubernetes.io/projected/6e092301-a354-4703-a3a0-a8e0656f11ca-kube-api-access-n9gbq\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.568926 master-0 kubenswrapper[16352]: I0307 21:43:17.568905 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/6e092301-a354-4703-a3a0-a8e0656f11ca-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: 
\"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.569251 master-0 kubenswrapper[16352]: I0307 21:43:17.569173 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6e092301-a354-4703-a3a0-a8e0656f11ca-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.569327 master-0 kubenswrapper[16352]: I0307 21:43:17.569282 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-config\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.676732 master-0 kubenswrapper[16352]: I0307 21:43:17.676515 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/6e092301-a354-4703-a3a0-a8e0656f11ca-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.676732 master-0 kubenswrapper[16352]: I0307 21:43:17.676668 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6e092301-a354-4703-a3a0-a8e0656f11ca-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.677080 master-0 kubenswrapper[16352]: I0307 21:43:17.676740 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-config\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 
21:43:17.677080 master-0 kubenswrapper[16352]: I0307 21:43:17.676921 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/6e092301-a354-4703-a3a0-a8e0656f11ca-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.677080 master-0 kubenswrapper[16352]: I0307 21:43:17.676975 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-scripts\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.677080 master-0 kubenswrapper[16352]: I0307 21:43:17.677063 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.677342 master-0 kubenswrapper[16352]: I0307 21:43:17.677181 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.677539 master-0 kubenswrapper[16352]: I0307 21:43:17.677416 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.677639 master-0 
kubenswrapper[16352]: I0307 21:43:17.677535 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9gbq\" (UniqueName: \"kubernetes.io/projected/6e092301-a354-4703-a3a0-a8e0656f11ca-kube-api-access-n9gbq\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.679382 master-0 kubenswrapper[16352]: I0307 21:43:17.678386 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2828e4cd-2480-4309-bb23-a8e5342365ce\" (UniqueName: \"kubernetes.io/csi/topolvm.io^db15846d-abb0-4379-a49f-2ae1229c819e\") pod \"glance-213eb-default-internal-api-0\" (UID: \"563f0661-4099-45e9-815f-cad67b405fae\") " pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:17.679382 master-0 kubenswrapper[16352]: I0307 21:43:17.678996 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic-inspector-dhcp-hostsdir\" (UniqueName: \"kubernetes.io/empty-dir/6e092301-a354-4703-a3a0-a8e0656f11ca-var-lib-ironic-inspector-dhcp-hostsdir\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.679382 master-0 kubenswrapper[16352]: I0307 21:43:17.679208 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-ironic\" (UniqueName: \"kubernetes.io/empty-dir/6e092301-a354-4703-a3a0-a8e0656f11ca-var-lib-ironic\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.688215 master-0 kubenswrapper[16352]: I0307 21:43:17.688126 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-scripts\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.696435 master-0 kubenswrapper[16352]: 
I0307 21:43:17.696367 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-internal-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.706917 master-0 kubenswrapper[16352]: I0307 21:43:17.698265 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-podinfo\" (UniqueName: \"kubernetes.io/downward-api/6e092301-a354-4703-a3a0-a8e0656f11ca-etc-podinfo\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.706917 master-0 kubenswrapper[16352]: I0307 21:43:17.701107 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-config\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.714106 master-0 kubenswrapper[16352]: I0307 21:43:17.708287 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-combined-ca-bundle\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.714106 master-0 kubenswrapper[16352]: I0307 21:43:17.708301 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/6e092301-a354-4703-a3a0-a8e0656f11ca-public-tls-certs\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.714106 master-0 kubenswrapper[16352]: I0307 21:43:17.709377 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9gbq\" (UniqueName: 
\"kubernetes.io/projected/6e092301-a354-4703-a3a0-a8e0656f11ca-kube-api-access-n9gbq\") pod \"ironic-inspector-0\" (UID: \"6e092301-a354-4703-a3a0-a8e0656f11ca\") " pod="openstack/ironic-inspector-0" Mar 07 21:43:17.720974 master-0 kubenswrapper[16352]: I0307 21:43:17.720334 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ironic-inspector-0" Mar 07 21:43:17.955662 master-0 kubenswrapper[16352]: I0307 21:43:17.955371 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-213eb-default-internal-api-0" Mar 07 21:43:18.001461 master-0 kubenswrapper[16352]: I0307 21:43:18.001393 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-26xt5" Mar 07 21:43:18.021562 master-0 kubenswrapper[16352]: I0307 21:43:18.021181 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-94ssk" Mar 07 21:43:18.103004 master-0 kubenswrapper[16352]: I0307 21:43:18.102811 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6jtvv\" (UniqueName: \"kubernetes.io/projected/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-kube-api-access-6jtvv\") pod \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " Mar 07 21:43:18.103279 master-0 kubenswrapper[16352]: I0307 21:43:18.103185 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-operator-scripts\") pod \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\" (UID: \"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb\") " Mar 07 21:43:18.104349 master-0 kubenswrapper[16352]: I0307 21:43:18.104317 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-operator-scripts" 
(OuterVolumeSpecName: "operator-scripts") pod "0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb" (UID: "0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:43:18.111670 master-0 kubenswrapper[16352]: I0307 21:43:18.109512 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-kube-api-access-6jtvv" (OuterVolumeSpecName: "kube-api-access-6jtvv") pod "0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb" (UID: "0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb"). InnerVolumeSpecName "kube-api-access-6jtvv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:18.138449 master-0 kubenswrapper[16352]: I0307 21:43:18.138338 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-26xt5" event={"ID":"0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb","Type":"ContainerDied","Data":"c3b555a3c95fbd9e6b453987f5fdf4412e97e07c7e91d57ec0ca3de5807b9ebc"} Mar 07 21:43:18.138449 master-0 kubenswrapper[16352]: I0307 21:43:18.138431 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3b555a3c95fbd9e6b453987f5fdf4412e97e07c7e91d57ec0ca3de5807b9ebc" Mar 07 21:43:18.138449 master-0 kubenswrapper[16352]: I0307 21:43:18.138391 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-26xt5" Mar 07 21:43:18.145785 master-0 kubenswrapper[16352]: I0307 21:43:18.143026 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-94ssk" event={"ID":"38ea555e-6a67-483f-96a2-9587104b0c38","Type":"ContainerDied","Data":"7b0cf56f322faebd19a05059fe306d3309485cbe7cbf0b2829f2d1b2abd52a89"} Mar 07 21:43:18.145785 master-0 kubenswrapper[16352]: I0307 21:43:18.143083 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b0cf56f322faebd19a05059fe306d3309485cbe7cbf0b2829f2d1b2abd52a89" Mar 07 21:43:18.145785 master-0 kubenswrapper[16352]: I0307 21:43:18.143152 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-94ssk" Mar 07 21:43:18.205426 master-0 kubenswrapper[16352]: I0307 21:43:18.205154 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38ea555e-6a67-483f-96a2-9587104b0c38-operator-scripts\") pod \"38ea555e-6a67-483f-96a2-9587104b0c38\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " Mar 07 21:43:18.205426 master-0 kubenswrapper[16352]: I0307 21:43:18.205250 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sfstw\" (UniqueName: \"kubernetes.io/projected/38ea555e-6a67-483f-96a2-9587104b0c38-kube-api-access-sfstw\") pod \"38ea555e-6a67-483f-96a2-9587104b0c38\" (UID: \"38ea555e-6a67-483f-96a2-9587104b0c38\") " Mar 07 21:43:18.206913 master-0 kubenswrapper[16352]: I0307 21:43:18.206045 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/38ea555e-6a67-483f-96a2-9587104b0c38-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "38ea555e-6a67-483f-96a2-9587104b0c38" (UID: "38ea555e-6a67-483f-96a2-9587104b0c38"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:43:18.206913 master-0 kubenswrapper[16352]: I0307 21:43:18.206407 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/38ea555e-6a67-483f-96a2-9587104b0c38-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.206913 master-0 kubenswrapper[16352]: I0307 21:43:18.206427 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.206913 master-0 kubenswrapper[16352]: I0307 21:43:18.206438 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6jtvv\" (UniqueName: \"kubernetes.io/projected/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb-kube-api-access-6jtvv\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.211244 master-0 kubenswrapper[16352]: I0307 21:43:18.211176 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38ea555e-6a67-483f-96a2-9587104b0c38-kube-api-access-sfstw" (OuterVolumeSpecName: "kube-api-access-sfstw") pod "38ea555e-6a67-483f-96a2-9587104b0c38" (UID: "38ea555e-6a67-483f-96a2-9587104b0c38"). InnerVolumeSpecName "kube-api-access-sfstw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:18.218785 master-0 kubenswrapper[16352]: I0307 21:43:18.218659 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-2b75-account-create-update-gqckp" Mar 07 21:43:18.298738 master-0 kubenswrapper[16352]: I0307 21:43:18.297082 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-64285" Mar 07 21:43:18.315063 master-0 kubenswrapper[16352]: I0307 21:43:18.311581 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sfstw\" (UniqueName: \"kubernetes.io/projected/38ea555e-6a67-483f-96a2-9587104b0c38-kube-api-access-sfstw\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.422798 master-0 kubenswrapper[16352]: I0307 21:43:18.414372 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2130caf7-6f24-4cb7-a216-e60f2b951f4a-operator-scripts\") pod \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " Mar 07 21:43:18.422798 master-0 kubenswrapper[16352]: I0307 21:43:18.414422 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5793afc9-06c2-497b-ab66-92254e79e871-operator-scripts\") pod \"5793afc9-06c2-497b-ab66-92254e79e871\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " Mar 07 21:43:18.422798 master-0 kubenswrapper[16352]: I0307 21:43:18.414477 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkwl9\" (UniqueName: \"kubernetes.io/projected/5793afc9-06c2-497b-ab66-92254e79e871-kube-api-access-zkwl9\") pod \"5793afc9-06c2-497b-ab66-92254e79e871\" (UID: \"5793afc9-06c2-497b-ab66-92254e79e871\") " Mar 07 21:43:18.422798 master-0 kubenswrapper[16352]: I0307 21:43:18.414516 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ws6tk\" (UniqueName: \"kubernetes.io/projected/2130caf7-6f24-4cb7-a216-e60f2b951f4a-kube-api-access-ws6tk\") pod \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\" (UID: \"2130caf7-6f24-4cb7-a216-e60f2b951f4a\") " Mar 07 21:43:18.422798 master-0 kubenswrapper[16352]: I0307 21:43:18.415124 16352 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2130caf7-6f24-4cb7-a216-e60f2b951f4a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2130caf7-6f24-4cb7-a216-e60f2b951f4a" (UID: "2130caf7-6f24-4cb7-a216-e60f2b951f4a"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:43:18.422798 master-0 kubenswrapper[16352]: I0307 21:43:18.415713 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5793afc9-06c2-497b-ab66-92254e79e871-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5793afc9-06c2-497b-ab66-92254e79e871" (UID: "5793afc9-06c2-497b-ab66-92254e79e871"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:43:18.430845 master-0 kubenswrapper[16352]: I0307 21:43:18.428769 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2130caf7-6f24-4cb7-a216-e60f2b951f4a-kube-api-access-ws6tk" (OuterVolumeSpecName: "kube-api-access-ws6tk") pod "2130caf7-6f24-4cb7-a216-e60f2b951f4a" (UID: "2130caf7-6f24-4cb7-a216-e60f2b951f4a"). InnerVolumeSpecName "kube-api-access-ws6tk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:18.430845 master-0 kubenswrapper[16352]: I0307 21:43:18.428853 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5793afc9-06c2-497b-ab66-92254e79e871-kube-api-access-zkwl9" (OuterVolumeSpecName: "kube-api-access-zkwl9") pod "5793afc9-06c2-497b-ab66-92254e79e871" (UID: "5793afc9-06c2-497b-ab66-92254e79e871"). InnerVolumeSpecName "kube-api-access-zkwl9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:18.470021 master-0 kubenswrapper[16352]: I0307 21:43:18.461060 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-external-api-0"] Mar 07 21:43:18.523079 master-0 kubenswrapper[16352]: I0307 21:43:18.522996 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2130caf7-6f24-4cb7-a216-e60f2b951f4a-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.524304 master-0 kubenswrapper[16352]: I0307 21:43:18.524244 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ironic-inspector-0"] Mar 07 21:43:18.524446 master-0 kubenswrapper[16352]: I0307 21:43:18.524337 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5793afc9-06c2-497b-ab66-92254e79e871-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.524537 master-0 kubenswrapper[16352]: I0307 21:43:18.524525 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkwl9\" (UniqueName: \"kubernetes.io/projected/5793afc9-06c2-497b-ab66-92254e79e871-kube-api-access-zkwl9\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.524602 master-0 kubenswrapper[16352]: I0307 21:43:18.524592 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ws6tk\" (UniqueName: \"kubernetes.io/projected/2130caf7-6f24-4cb7-a216-e60f2b951f4a-kube-api-access-ws6tk\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:18.603294 master-0 kubenswrapper[16352]: W0307 21:43:18.602108 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e092301_a354_4703_a3a0_a8e0656f11ca.slice/crio-fc8ec2f3f140d75a115be80869e97daf113f3b5c627a8cc6f5c1ae971d775167 WatchSource:0}: Error finding container 
fc8ec2f3f140d75a115be80869e97daf113f3b5c627a8cc6f5c1ae971d775167: Status 404 returned error can't find the container with id fc8ec2f3f140d75a115be80869e97daf113f3b5c627a8cc6f5c1ae971d775167 Mar 07 21:43:18.982501 master-0 kubenswrapper[16352]: I0307 21:43:18.981663 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0300-account-create-update-b66m5" Mar 07 21:43:18.991650 master-0 kubenswrapper[16352]: I0307 21:43:18.991519 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-8a73-account-create-update-s57x2" Mar 07 21:43:19.052457 master-0 kubenswrapper[16352]: I0307 21:43:19.052361 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sxs8f\" (UniqueName: \"kubernetes.io/projected/2b487d67-175f-402c-883f-a4001fd9160c-kube-api-access-sxs8f\") pod \"2b487d67-175f-402c-883f-a4001fd9160c\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " Mar 07 21:43:19.052794 master-0 kubenswrapper[16352]: I0307 21:43:19.052758 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-operator-scripts\") pod \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " Mar 07 21:43:19.052838 master-0 kubenswrapper[16352]: I0307 21:43:19.052816 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nvj5n\" (UniqueName: \"kubernetes.io/projected/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-kube-api-access-nvj5n\") pod \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\" (UID: \"6bfd3214-eccf-402a-93de-6bd7f2cb2c08\") " Mar 07 21:43:19.053035 master-0 kubenswrapper[16352]: I0307 21:43:19.053007 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/2b487d67-175f-402c-883f-a4001fd9160c-operator-scripts\") pod \"2b487d67-175f-402c-883f-a4001fd9160c\" (UID: \"2b487d67-175f-402c-883f-a4001fd9160c\") " Mar 07 21:43:19.053473 master-0 kubenswrapper[16352]: I0307 21:43:19.053403 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6bfd3214-eccf-402a-93de-6bd7f2cb2c08" (UID: "6bfd3214-eccf-402a-93de-6bd7f2cb2c08"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:43:19.053697 master-0 kubenswrapper[16352]: I0307 21:43:19.053645 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:19.054020 master-0 kubenswrapper[16352]: I0307 21:43:19.053992 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2b487d67-175f-402c-883f-a4001fd9160c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "2b487d67-175f-402c-883f-a4001fd9160c" (UID: "2b487d67-175f-402c-883f-a4001fd9160c"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:43:19.090920 master-0 kubenswrapper[16352]: I0307 21:43:19.085363 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-kube-api-access-nvj5n" (OuterVolumeSpecName: "kube-api-access-nvj5n") pod "6bfd3214-eccf-402a-93de-6bd7f2cb2c08" (UID: "6bfd3214-eccf-402a-93de-6bd7f2cb2c08"). InnerVolumeSpecName "kube-api-access-nvj5n". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:19.090920 master-0 kubenswrapper[16352]: I0307 21:43:19.085569 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b487d67-175f-402c-883f-a4001fd9160c-kube-api-access-sxs8f" (OuterVolumeSpecName: "kube-api-access-sxs8f") pod "2b487d67-175f-402c-883f-a4001fd9160c" (UID: "2b487d67-175f-402c-883f-a4001fd9160c"). InnerVolumeSpecName "kube-api-access-sxs8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:43:19.161612 master-0 kubenswrapper[16352]: I0307 21:43:19.158384 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nvj5n\" (UniqueName: \"kubernetes.io/projected/6bfd3214-eccf-402a-93de-6bd7f2cb2c08-kube-api-access-nvj5n\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:19.161612 master-0 kubenswrapper[16352]: I0307 21:43:19.158445 16352 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/2b487d67-175f-402c-883f-a4001fd9160c-operator-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:19.161612 master-0 kubenswrapper[16352]: I0307 21:43:19.158457 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sxs8f\" (UniqueName: \"kubernetes.io/projected/2b487d67-175f-402c-883f-a4001fd9160c-kube-api-access-sxs8f\") on node \"master-0\" DevicePath \"\"" Mar 07 21:43:19.181909 master-0 kubenswrapper[16352]: I0307 21:43:19.180926 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-213eb-default-internal-api-0"] Mar 07 21:43:19.210166 master-0 kubenswrapper[16352]: I0307 21:43:19.210080 16352 generic.go:334] "Generic (PLEG): container finished" podID="121505c3-5091-4945-a0aa-ec97b5f45ce5" containerID="ef75ed0336d010624a52fb1e393805126085a9fe8964f5b60ac3f4ea3faae039" exitCode=0 Mar 07 21:43:19.213365 master-0 kubenswrapper[16352]: I0307 21:43:19.213313 16352 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-64285" Mar 07 21:43:19.213899 master-0 kubenswrapper[16352]: I0307 21:43:19.213836 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c7d1db8d-1ad0-49f1-a993-d68d3587f595" path="/var/lib/kubelet/pods/c7d1db8d-1ad0-49f1-a993-d68d3587f595/volumes" Mar 07 21:43:19.216356 master-0 kubenswrapper[16352]: I0307 21:43:19.215757 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerDied","Data":"ef75ed0336d010624a52fb1e393805126085a9fe8964f5b60ac3f4ea3faae039"} Mar 07 21:43:19.216356 master-0 kubenswrapper[16352]: I0307 21:43:19.215793 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-64285" event={"ID":"2130caf7-6f24-4cb7-a216-e60f2b951f4a","Type":"ContainerDied","Data":"1fb1396ee29687d5b9e218d97f02820f0e7a1fee01c7100e737d625958e7b6ee"} Mar 07 21:43:19.216356 master-0 kubenswrapper[16352]: I0307 21:43:19.215810 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1fb1396ee29687d5b9e218d97f02820f0e7a1fee01c7100e737d625958e7b6ee" Mar 07 21:43:19.216356 master-0 kubenswrapper[16352]: I0307 21:43:19.215821 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"563f0661-4099-45e9-815f-cad67b405fae","Type":"ContainerStarted","Data":"41efa9799f10e300ca2e0e57dbcd6228ea0df4917ef42d40efe5e4e9162a51d4"} Mar 07 21:43:19.224530 master-0 kubenswrapper[16352]: I0307 21:43:19.223836 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-8a73-account-create-update-s57x2" Mar 07 21:43:19.224530 master-0 kubenswrapper[16352]: I0307 21:43:19.223866 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-8a73-account-create-update-s57x2" event={"ID":"2b487d67-175f-402c-883f-a4001fd9160c","Type":"ContainerDied","Data":"e2d6d3864b1891ea00b3187160c0956a87d11453b086e70698c63c8ac6bc8678"} Mar 07 21:43:19.224530 master-0 kubenswrapper[16352]: I0307 21:43:19.223954 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e2d6d3864b1891ea00b3187160c0956a87d11453b086e70698c63c8ac6bc8678" Mar 07 21:43:19.231236 master-0 kubenswrapper[16352]: I0307 21:43:19.229766 16352 generic.go:334] "Generic (PLEG): container finished" podID="6e092301-a354-4703-a3a0-a8e0656f11ca" containerID="2f036b3428ba26217382abc050c2808c9b391a9698d2bb33c19a28f3e943f554" exitCode=0 Mar 07 21:43:19.231236 master-0 kubenswrapper[16352]: I0307 21:43:19.229861 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerDied","Data":"2f036b3428ba26217382abc050c2808c9b391a9698d2bb33c19a28f3e943f554"} Mar 07 21:43:19.231236 master-0 kubenswrapper[16352]: I0307 21:43:19.229897 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerStarted","Data":"fc8ec2f3f140d75a115be80869e97daf113f3b5c627a8cc6f5c1ae971d775167"} Mar 07 21:43:19.235507 master-0 kubenswrapper[16352]: I0307 21:43:19.235445 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-0300-account-create-update-b66m5" event={"ID":"6bfd3214-eccf-402a-93de-6bd7f2cb2c08","Type":"ContainerDied","Data":"b42426e70540d73d2267bd013342c2d645946a7deb2e060b5467ab4c06c7733c"} Mar 07 21:43:19.235585 master-0 kubenswrapper[16352]: I0307 21:43:19.235513 16352 
pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b42426e70540d73d2267bd013342c2d645946a7deb2e060b5467ab4c06c7733c" Mar 07 21:43:19.235585 master-0 kubenswrapper[16352]: I0307 21:43:19.235586 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-0300-account-create-update-b66m5" Mar 07 21:43:19.241624 master-0 kubenswrapper[16352]: I0307 21:43:19.240625 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"3dfded6e-7754-4d42-96ab-1bab69176899","Type":"ContainerStarted","Data":"32da7ad2f2feb00c7260e9ab136c1ea1a3a611f91025b9f490410d7db604502c"} Mar 07 21:43:19.241624 master-0 kubenswrapper[16352]: I0307 21:43:19.240730 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"3dfded6e-7754-4d42-96ab-1bab69176899","Type":"ContainerStarted","Data":"c83b6fc187c0da630a76fbcc43d6bc093a689befc4e58fafcf78a6b4b66db2f9"} Mar 07 21:43:19.253730 master-0 kubenswrapper[16352]: I0307 21:43:19.253628 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-2b75-account-create-update-gqckp" event={"ID":"5793afc9-06c2-497b-ab66-92254e79e871","Type":"ContainerDied","Data":"ead7ade37095ad8dc4ef467d740f2e748b588b490b724bbe83b5cfc3b2b4b41f"} Mar 07 21:43:19.253901 master-0 kubenswrapper[16352]: I0307 21:43:19.253828 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ead7ade37095ad8dc4ef467d740f2e748b588b490b724bbe83b5cfc3b2b4b41f" Mar 07 21:43:19.254081 master-0 kubenswrapper[16352]: I0307 21:43:19.253846 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-2b75-account-create-update-gqckp" Mar 07 21:43:20.326520 master-0 kubenswrapper[16352]: I0307 21:43:20.326366 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-external-api-0" event={"ID":"3dfded6e-7754-4d42-96ab-1bab69176899","Type":"ContainerStarted","Data":"75e3cd63df2cb0fd4b7987c31c4aa76692a8d7a77efba0b1f61e204b3117096a"} Mar 07 21:43:20.345721 master-0 kubenswrapper[16352]: I0307 21:43:20.345601 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"563f0661-4099-45e9-815f-cad67b405fae","Type":"ContainerStarted","Data":"bf4748dc0db54a6a300cbda443072817e200f7e17482b9ac5908e2a4397fe217"} Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: I0307 21:43:20.380903 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xm4p"] Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: E0307 21:43:20.381670 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5793afc9-06c2-497b-ab66-92254e79e871" containerName="mariadb-account-create-update" Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: I0307 21:43:20.381706 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="5793afc9-06c2-497b-ab66-92254e79e871" containerName="mariadb-account-create-update" Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: E0307 21:43:20.381735 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38ea555e-6a67-483f-96a2-9587104b0c38" containerName="mariadb-database-create" Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: I0307 21:43:20.381742 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="38ea555e-6a67-483f-96a2-9587104b0c38" containerName="mariadb-database-create" Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: E0307 21:43:20.381769 16352 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="6bfd3214-eccf-402a-93de-6bd7f2cb2c08" containerName="mariadb-account-create-update"
Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: I0307 21:43:20.381777 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="6bfd3214-eccf-402a-93de-6bd7f2cb2c08" containerName="mariadb-account-create-update"
Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: E0307 21:43:20.381816 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb" containerName="mariadb-database-create"
Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: I0307 21:43:20.381823 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb" containerName="mariadb-database-create"
Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: E0307 21:43:20.381835 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2130caf7-6f24-4cb7-a216-e60f2b951f4a" containerName="mariadb-database-create"
Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: I0307 21:43:20.381842 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2130caf7-6f24-4cb7-a216-e60f2b951f4a" containerName="mariadb-database-create"
Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: E0307 21:43:20.381857 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b487d67-175f-402c-883f-a4001fd9160c" containerName="mariadb-account-create-update"
Mar 07 21:43:20.381850 master-0 kubenswrapper[16352]: I0307 21:43:20.381863 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b487d67-175f-402c-883f-a4001fd9160c" containerName="mariadb-account-create-update"
Mar 07 21:43:20.382455 master-0 kubenswrapper[16352]: I0307 21:43:20.382254 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="5793afc9-06c2-497b-ab66-92254e79e871" containerName="mariadb-account-create-update"
Mar 07 21:43:20.382455 master-0 kubenswrapper[16352]: I0307 21:43:20.382282 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2130caf7-6f24-4cb7-a216-e60f2b951f4a" containerName="mariadb-database-create"
Mar 07 21:43:20.382455 master-0 kubenswrapper[16352]: I0307 21:43:20.382299 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="6bfd3214-eccf-402a-93de-6bd7f2cb2c08" containerName="mariadb-account-create-update"
Mar 07 21:43:20.382455 master-0 kubenswrapper[16352]: I0307 21:43:20.382316 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="38ea555e-6a67-483f-96a2-9587104b0c38" containerName="mariadb-database-create"
Mar 07 21:43:20.382455 master-0 kubenswrapper[16352]: I0307 21:43:20.382334 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb" containerName="mariadb-database-create"
Mar 07 21:43:20.382455 master-0 kubenswrapper[16352]: I0307 21:43:20.382344 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b487d67-175f-402c-883f-a4001fd9160c" containerName="mariadb-account-create-update"
Mar 07 21:43:20.387254 master-0 kubenswrapper[16352]: I0307 21:43:20.383341 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.387804 master-0 kubenswrapper[16352]: I0307 21:43:20.387775 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data"
Mar 07 21:43:20.388031 master-0 kubenswrapper[16352]: I0307 21:43:20.388010 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts"
Mar 07 21:43:20.400709 master-0 kubenswrapper[16352]: I0307 21:43:20.399215 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xm4p"]
Mar 07 21:43:20.415559 master-0 kubenswrapper[16352]: I0307 21:43:20.415481 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-213eb-default-external-api-0" podStartSLOduration=5.415450798 podStartE2EDuration="5.415450798s" podCreationTimestamp="2026-03-07 21:43:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:43:20.365290603 +0000 UTC m=+1523.435995662" watchObservedRunningTime="2026-03-07 21:43:20.415450798 +0000 UTC m=+1523.486155857"
Mar 07 21:43:20.512945 master-0 kubenswrapper[16352]: I0307 21:43:20.512884 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-scripts\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.513254 master-0 kubenswrapper[16352]: I0307 21:43:20.513236 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-config-data\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.513397 master-0 kubenswrapper[16352]: I0307 21:43:20.513374 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.513512 master-0 kubenswrapper[16352]: I0307 21:43:20.513496 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kw4vm\" (UniqueName: \"kubernetes.io/projected/dec069dd-1a94-4b25-95f1-1346f25cf204-kube-api-access-kw4vm\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.618163 master-0 kubenswrapper[16352]: I0307 21:43:20.618082 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-scripts\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.618470 master-0 kubenswrapper[16352]: I0307 21:43:20.618331 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-config-data\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.618470 master-0 kubenswrapper[16352]: I0307 21:43:20.618378 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.618470 master-0 kubenswrapper[16352]: I0307 21:43:20.618450 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kw4vm\" (UniqueName: \"kubernetes.io/projected/dec069dd-1a94-4b25-95f1-1346f25cf204-kube-api-access-kw4vm\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.624483 master-0 kubenswrapper[16352]: I0307 21:43:20.624415 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.629580 master-0 kubenswrapper[16352]: I0307 21:43:20.629516 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-scripts\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.630989 master-0 kubenswrapper[16352]: I0307 21:43:20.629895 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-config-data\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.641714 master-0 kubenswrapper[16352]: I0307 21:43:20.641158 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kw4vm\" (UniqueName: \"kubernetes.io/projected/dec069dd-1a94-4b25-95f1-1346f25cf204-kube-api-access-kw4vm\") pod \"nova-cell0-conductor-db-sync-9xm4p\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") " pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:20.723281 master-0 kubenswrapper[16352]: I0307 21:43:20.723159 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:21.216764 master-0 kubenswrapper[16352]: I0307 21:43:21.216667 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xm4p"]
Mar 07 21:43:21.222825 master-0 kubenswrapper[16352]: W0307 21:43:21.222746 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddec069dd_1a94_4b25_95f1_1346f25cf204.slice/crio-9e59d3236f6159c4c720ad5425a8caa4fbb16a5fb4b84ac56237d5274ed4c390 WatchSource:0}: Error finding container 9e59d3236f6159c4c720ad5425a8caa4fbb16a5fb4b84ac56237d5274ed4c390: Status 404 returned error can't find the container with id 9e59d3236f6159c4c720ad5425a8caa4fbb16a5fb4b84ac56237d5274ed4c390
Mar 07 21:43:21.371293 master-0 kubenswrapper[16352]: I0307 21:43:21.371145 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xm4p" event={"ID":"dec069dd-1a94-4b25-95f1-1346f25cf204","Type":"ContainerStarted","Data":"9e59d3236f6159c4c720ad5425a8caa4fbb16a5fb4b84ac56237d5274ed4c390"}
Mar 07 21:43:21.379570 master-0 kubenswrapper[16352]: I0307 21:43:21.379495 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-213eb-default-internal-api-0" event={"ID":"563f0661-4099-45e9-815f-cad67b405fae","Type":"ContainerStarted","Data":"3418a255ed3926441be602243de9034a6c2121258a4de56f3e1f5efe3c992c70"}
Mar 07 21:43:21.426004 master-0 kubenswrapper[16352]: I0307 21:43:21.425890 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-213eb-default-internal-api-0" podStartSLOduration=5.425857269 podStartE2EDuration="5.425857269s" podCreationTimestamp="2026-03-07 21:43:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:43:21.40880161 +0000 UTC m=+1524.479506659" watchObservedRunningTime="2026-03-07 21:43:21.425857269 +0000 UTC m=+1524.496562328"
Mar 07 21:43:24.444570 master-0 kubenswrapper[16352]: I0307 21:43:24.443349 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerStarted","Data":"03f8ba09e8c196f3b491487a7dd3bd9b188bb1c612c7b9364bb0a665b63ecfce"}
Mar 07 21:43:24.447717 master-0 kubenswrapper[16352]: I0307 21:43:24.447642 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerStarted","Data":"60fa4e08275a99f5f5e0c22a4ccb1a69a1e5b7b36f16a0c570846f2e2d2cd1d5"}
Mar 07 21:43:24.723170 master-0 kubenswrapper[16352]: I0307 21:43:24.723015 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd"
Mar 07 21:43:25.044710 master-0 kubenswrapper[16352]: I0307 21:43:25.029597 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699fc4cfdf-cmxnl"]
Mar 07 21:43:25.044710 master-0 kubenswrapper[16352]: I0307 21:43:25.030019 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" podUID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerName="dnsmasq-dns" containerID="cri-o://f568bf1ba59818260d2de9d934dd50f972a9e11ea5a20a8ddc126c606888ab7a" gracePeriod=10
Mar 07 21:43:25.498991 master-0 kubenswrapper[16352]: I0307 21:43:25.498904 16352 generic.go:334] "Generic (PLEG): container finished" podID="6e092301-a354-4703-a3a0-a8e0656f11ca" containerID="60fa4e08275a99f5f5e0c22a4ccb1a69a1e5b7b36f16a0c570846f2e2d2cd1d5" exitCode=0
Mar 07 21:43:25.499612 master-0 kubenswrapper[16352]: I0307 21:43:25.499010 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerDied","Data":"60fa4e08275a99f5f5e0c22a4ccb1a69a1e5b7b36f16a0c570846f2e2d2cd1d5"}
Mar 07 21:43:25.506230 master-0 kubenswrapper[16352]: I0307 21:43:25.506173 16352 generic.go:334] "Generic (PLEG): container finished" podID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerID="f568bf1ba59818260d2de9d934dd50f972a9e11ea5a20a8ddc126c606888ab7a" exitCode=0
Mar 07 21:43:25.506534 master-0 kubenswrapper[16352]: I0307 21:43:25.506278 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" event={"ID":"e33d7a05-baac-460b-9f72-133d1f7c7b07","Type":"ContainerDied","Data":"f568bf1ba59818260d2de9d934dd50f972a9e11ea5a20a8ddc126c606888ab7a"}
Mar 07 21:43:25.726302 master-0 kubenswrapper[16352]: I0307 21:43:25.726219 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:43:25.846844 master-0 kubenswrapper[16352]: I0307 21:43:25.843276 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nj4x6\" (UniqueName: \"kubernetes.io/projected/e33d7a05-baac-460b-9f72-133d1f7c7b07-kube-api-access-nj4x6\") pod \"e33d7a05-baac-460b-9f72-133d1f7c7b07\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") "
Mar 07 21:43:25.846844 master-0 kubenswrapper[16352]: I0307 21:43:25.843457 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-nb\") pod \"e33d7a05-baac-460b-9f72-133d1f7c7b07\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") "
Mar 07 21:43:25.846844 master-0 kubenswrapper[16352]: I0307 21:43:25.843562 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-config\") pod \"e33d7a05-baac-460b-9f72-133d1f7c7b07\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") "
Mar 07 21:43:25.846844 master-0 kubenswrapper[16352]: I0307 21:43:25.843711 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-svc\") pod \"e33d7a05-baac-460b-9f72-133d1f7c7b07\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") "
Mar 07 21:43:25.846844 master-0 kubenswrapper[16352]: I0307 21:43:25.843779 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-sb\") pod \"e33d7a05-baac-460b-9f72-133d1f7c7b07\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") "
Mar 07 21:43:25.846844 master-0 kubenswrapper[16352]: I0307 21:43:25.843800 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-swift-storage-0\") pod \"e33d7a05-baac-460b-9f72-133d1f7c7b07\" (UID: \"e33d7a05-baac-460b-9f72-133d1f7c7b07\") "
Mar 07 21:43:25.887316 master-0 kubenswrapper[16352]: I0307 21:43:25.877054 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e33d7a05-baac-460b-9f72-133d1f7c7b07-kube-api-access-nj4x6" (OuterVolumeSpecName: "kube-api-access-nj4x6") pod "e33d7a05-baac-460b-9f72-133d1f7c7b07" (UID: "e33d7a05-baac-460b-9f72-133d1f7c7b07"). InnerVolumeSpecName "kube-api-access-nj4x6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:43:25.954890 master-0 kubenswrapper[16352]: I0307 21:43:25.948492 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nj4x6\" (UniqueName: \"kubernetes.io/projected/e33d7a05-baac-460b-9f72-133d1f7c7b07-kube-api-access-nj4x6\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:26.000826 master-0 kubenswrapper[16352]: I0307 21:43:25.987451 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "e33d7a05-baac-460b-9f72-133d1f7c7b07" (UID: "e33d7a05-baac-460b-9f72-133d1f7c7b07"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:43:26.024343 master-0 kubenswrapper[16352]: I0307 21:43:26.012283 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "e33d7a05-baac-460b-9f72-133d1f7c7b07" (UID: "e33d7a05-baac-460b-9f72-133d1f7c7b07"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:43:26.052379 master-0 kubenswrapper[16352]: I0307 21:43:26.052297 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:26.053655 master-0 kubenswrapper[16352]: I0307 21:43:26.053626 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:26.081470 master-0 kubenswrapper[16352]: I0307 21:43:26.079900 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "e33d7a05-baac-460b-9f72-133d1f7c7b07" (UID: "e33d7a05-baac-460b-9f72-133d1f7c7b07"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:43:26.084711 master-0 kubenswrapper[16352]: I0307 21:43:26.084229 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-config" (OuterVolumeSpecName: "config") pod "e33d7a05-baac-460b-9f72-133d1f7c7b07" (UID: "e33d7a05-baac-460b-9f72-133d1f7c7b07"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:43:26.122637 master-0 kubenswrapper[16352]: I0307 21:43:26.121510 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "e33d7a05-baac-460b-9f72-133d1f7c7b07" (UID: "e33d7a05-baac-460b-9f72-133d1f7c7b07"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:43:26.158997 master-0 kubenswrapper[16352]: I0307 21:43:26.158918 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:26.158997 master-0 kubenswrapper[16352]: I0307 21:43:26.158989 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:26.158997 master-0 kubenswrapper[16352]: I0307 21:43:26.159005 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/e33d7a05-baac-460b-9f72-133d1f7c7b07-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:26.523632 master-0 kubenswrapper[16352]: I0307 21:43:26.523505 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl" event={"ID":"e33d7a05-baac-460b-9f72-133d1f7c7b07","Type":"ContainerDied","Data":"7bcbc5d61802b8b22d410678b665a9a2ce3134c36d214615bcf30d7449e860c9"}
Mar 07 21:43:26.523632 master-0 kubenswrapper[16352]: I0307 21:43:26.523586 16352 scope.go:117] "RemoveContainer" containerID="f568bf1ba59818260d2de9d934dd50f972a9e11ea5a20a8ddc126c606888ab7a"
Mar 07 21:43:26.524351 master-0 kubenswrapper[16352]: I0307 21:43:26.523714 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-699fc4cfdf-cmxnl"
Mar 07 21:43:26.529704 master-0 kubenswrapper[16352]: I0307 21:43:26.527620 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerStarted","Data":"16de5aaebbb32b14a56480495e462ead4be31aaf147af7cb54f21386bb78fbf8"}
Mar 07 21:43:26.574144 master-0 kubenswrapper[16352]: I0307 21:43:26.573124 16352 scope.go:117] "RemoveContainer" containerID="20c63d6d67f75faa42d31573f6b657b524e1a39f33655f7269b5e7d4ac65b806"
Mar 07 21:43:26.648708 master-0 kubenswrapper[16352]: I0307 21:43:26.637169 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-699fc4cfdf-cmxnl"]
Mar 07 21:43:26.656761 master-0 kubenswrapper[16352]: I0307 21:43:26.654926 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-699fc4cfdf-cmxnl"]
Mar 07 21:43:26.723137 master-0 kubenswrapper[16352]: I0307 21:43:26.723050 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:26.723397 master-0 kubenswrapper[16352]: I0307 21:43:26.723250 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:26.791150 master-0 kubenswrapper[16352]: I0307 21:43:26.791075 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:26.815730 master-0 kubenswrapper[16352]: I0307 21:43:26.814097 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:27.220003 master-0 kubenswrapper[16352]: I0307 21:43:27.219766 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e33d7a05-baac-460b-9f72-133d1f7c7b07" path="/var/lib/kubelet/pods/e33d7a05-baac-460b-9f72-133d1f7c7b07/volumes"
Mar 07 21:43:27.576610 master-0 kubenswrapper[16352]: I0307 21:43:27.576020 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerStarted","Data":"b1478f34a1a4b5297db5a3e0dbbb6c4ebb1f351ca218dfa717b080a59eee2335"}
Mar 07 21:43:27.576610 master-0 kubenswrapper[16352]: I0307 21:43:27.576108 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerStarted","Data":"7e507b396d6e50dfff1c27262125a4cdcf0f60bc0fdcaa5f50a33a8913795e2b"}
Mar 07 21:43:27.576610 master-0 kubenswrapper[16352]: I0307 21:43:27.576545 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:27.577369 master-0 kubenswrapper[16352]: I0307 21:43:27.577338 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:27.967025 master-0 kubenswrapper[16352]: I0307 21:43:27.966886 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:27.967025 master-0 kubenswrapper[16352]: I0307 21:43:27.966990 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:28.009910 master-0 kubenswrapper[16352]: I0307 21:43:28.006772 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:28.027041 master-0 kubenswrapper[16352]: I0307 21:43:28.023624 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:28.592533 master-0 kubenswrapper[16352]: I0307 21:43:28.592466 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:28.592533 master-0 kubenswrapper[16352]: I0307 21:43:28.592522 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:29.609723 master-0 kubenswrapper[16352]: I0307 21:43:29.609074 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:43:29.609723 master-0 kubenswrapper[16352]: I0307 21:43:29.609135 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:43:29.949873 master-0 kubenswrapper[16352]: I0307 21:43:29.949718 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:29.971035 master-0 kubenswrapper[16352]: I0307 21:43:29.970954 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-external-api-0"
Mar 07 21:43:30.622583 master-0 kubenswrapper[16352]: I0307 21:43:30.621774 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:43:30.622583 master-0 kubenswrapper[16352]: I0307 21:43:30.621811 16352 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Mar 07 21:43:30.664731 master-0 kubenswrapper[16352]: I0307 21:43:30.664102 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:30.713714 master-0 kubenswrapper[16352]: I0307 21:43:30.712849 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-213eb-default-internal-api-0"
Mar 07 21:43:35.702005 master-0 kubenswrapper[16352]: I0307 21:43:35.701802 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerStarted","Data":"784efe2f59adf317ea26aee2fd66715dddf26307a12c71a7b8fead75d2fc8287"}
Mar 07 21:43:35.702005 master-0 kubenswrapper[16352]: I0307 21:43:35.701903 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-inspector-0" event={"ID":"6e092301-a354-4703-a3a0-a8e0656f11ca","Type":"ContainerStarted","Data":"9db6f3cefb136da8b9767e515c584911fef3ed5c6be75311b391b47dfcc629d0"}
Mar 07 21:43:35.703188 master-0 kubenswrapper[16352]: I0307 21:43:35.702283 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 07 21:43:35.707911 master-0 kubenswrapper[16352]: I0307 21:43:35.707796 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xm4p" event={"ID":"dec069dd-1a94-4b25-95f1-1346f25cf204","Type":"ContainerStarted","Data":"5996fd8a2d2dec6a64eb982884f39b5b612393a00023e1419365ca6955728216"}
Mar 07 21:43:35.708104 master-0 kubenswrapper[16352]: I0307 21:43:35.708067 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 07 21:43:35.756559 master-0 kubenswrapper[16352]: I0307 21:43:35.756310 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-inspector-0" podStartSLOduration=14.249777498 podStartE2EDuration="18.756285002s" podCreationTimestamp="2026-03-07 21:43:17 +0000 UTC" firstStartedPulling="2026-03-07 21:43:19.232330787 +0000 UTC m=+1522.303035846" lastFinishedPulling="2026-03-07 21:43:23.738838291 +0000 UTC m=+1526.809543350" observedRunningTime="2026-03-07 21:43:35.752262255 +0000 UTC m=+1538.822967324" watchObservedRunningTime="2026-03-07 21:43:35.756285002 +0000 UTC m=+1538.826990061"
Mar 07 21:43:35.865460 master-0 kubenswrapper[16352]: I0307 21:43:35.865284 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-9xm4p" podStartSLOduration=2.152687229 podStartE2EDuration="15.865206578s" podCreationTimestamp="2026-03-07 21:43:20 +0000 UTC" firstStartedPulling="2026-03-07 21:43:21.225823074 +0000 UTC m=+1524.296528133" lastFinishedPulling="2026-03-07 21:43:34.938342423 +0000 UTC m=+1538.009047482" observedRunningTime="2026-03-07 21:43:35.850081985 +0000 UTC m=+1538.920787054" watchObservedRunningTime="2026-03-07 21:43:35.865206578 +0000 UTC m=+1538.935911657"
Mar 07 21:43:36.723670 master-0 kubenswrapper[16352]: I0307 21:43:36.723551 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 07 21:43:37.721077 master-0 kubenswrapper[16352]: I0307 21:43:37.720973 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 07 21:43:37.721378 master-0 kubenswrapper[16352]: I0307 21:43:37.721109 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-inspector-0"
Mar 07 21:43:37.721378 master-0 kubenswrapper[16352]: I0307 21:43:37.721147 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 07 21:43:37.721378 master-0 kubenswrapper[16352]: I0307 21:43:37.721171 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-inspector-0"
Mar 07 21:43:37.746839 master-0 kubenswrapper[16352]: I0307 21:43:37.746761 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 07 21:43:37.751267 master-0 kubenswrapper[16352]: I0307 21:43:37.751200 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-inspector-0"
Mar 07 21:43:38.763167 master-0 kubenswrapper[16352]: I0307 21:43:38.763048 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 07 21:43:38.766149 master-0 kubenswrapper[16352]: I0307 21:43:38.766111 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 07 21:43:38.821121 master-0 kubenswrapper[16352]: I0307 21:43:38.821065 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-inspector-0"
Mar 07 21:43:53.005213 master-0 kubenswrapper[16352]: I0307 21:43:53.005091 16352 generic.go:334] "Generic (PLEG): container finished" podID="dec069dd-1a94-4b25-95f1-1346f25cf204" containerID="5996fd8a2d2dec6a64eb982884f39b5b612393a00023e1419365ca6955728216" exitCode=0
Mar 07 21:43:53.005213 master-0 kubenswrapper[16352]: I0307 21:43:53.005194 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xm4p" event={"ID":"dec069dd-1a94-4b25-95f1-1346f25cf204","Type":"ContainerDied","Data":"5996fd8a2d2dec6a64eb982884f39b5b612393a00023e1419365ca6955728216"}
Mar 07 21:43:54.625082 master-0 kubenswrapper[16352]: I0307 21:43:54.625002 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:54.744387 master-0 kubenswrapper[16352]: I0307 21:43:54.744022 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-combined-ca-bundle\") pod \"dec069dd-1a94-4b25-95f1-1346f25cf204\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") "
Mar 07 21:43:54.744387 master-0 kubenswrapper[16352]: I0307 21:43:54.744337 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-config-data\") pod \"dec069dd-1a94-4b25-95f1-1346f25cf204\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") "
Mar 07 21:43:54.744778 master-0 kubenswrapper[16352]: I0307 21:43:54.744477 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-scripts\") pod \"dec069dd-1a94-4b25-95f1-1346f25cf204\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") "
Mar 07 21:43:54.744778 master-0 kubenswrapper[16352]: I0307 21:43:54.744658 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kw4vm\" (UniqueName: \"kubernetes.io/projected/dec069dd-1a94-4b25-95f1-1346f25cf204-kube-api-access-kw4vm\") pod \"dec069dd-1a94-4b25-95f1-1346f25cf204\" (UID: \"dec069dd-1a94-4b25-95f1-1346f25cf204\") "
Mar 07 21:43:54.748101 master-0 kubenswrapper[16352]: I0307 21:43:54.748046 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-scripts" (OuterVolumeSpecName: "scripts") pod "dec069dd-1a94-4b25-95f1-1346f25cf204" (UID: "dec069dd-1a94-4b25-95f1-1346f25cf204"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:43:54.749852 master-0 kubenswrapper[16352]: I0307 21:43:54.749768 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dec069dd-1a94-4b25-95f1-1346f25cf204-kube-api-access-kw4vm" (OuterVolumeSpecName: "kube-api-access-kw4vm") pod "dec069dd-1a94-4b25-95f1-1346f25cf204" (UID: "dec069dd-1a94-4b25-95f1-1346f25cf204"). InnerVolumeSpecName "kube-api-access-kw4vm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:43:54.777349 master-0 kubenswrapper[16352]: I0307 21:43:54.776583 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-config-data" (OuterVolumeSpecName: "config-data") pod "dec069dd-1a94-4b25-95f1-1346f25cf204" (UID: "dec069dd-1a94-4b25-95f1-1346f25cf204"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:43:54.795232 master-0 kubenswrapper[16352]: I0307 21:43:54.795157 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "dec069dd-1a94-4b25-95f1-1346f25cf204" (UID: "dec069dd-1a94-4b25-95f1-1346f25cf204"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:43:54.848739 master-0 kubenswrapper[16352]: I0307 21:43:54.848637 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kw4vm\" (UniqueName: \"kubernetes.io/projected/dec069dd-1a94-4b25-95f1-1346f25cf204-kube-api-access-kw4vm\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:54.848739 master-0 kubenswrapper[16352]: I0307 21:43:54.848740 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:54.848962 master-0 kubenswrapper[16352]: I0307 21:43:54.848763 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:54.848962 master-0 kubenswrapper[16352]: I0307 21:43:54.848787 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/dec069dd-1a94-4b25-95f1-1346f25cf204-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:43:55.046730 master-0 kubenswrapper[16352]: I0307 21:43:55.046523 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-9xm4p" event={"ID":"dec069dd-1a94-4b25-95f1-1346f25cf204","Type":"ContainerDied","Data":"9e59d3236f6159c4c720ad5425a8caa4fbb16a5fb4b84ac56237d5274ed4c390"}
Mar 07 21:43:55.046730 master-0 kubenswrapper[16352]: I0307 21:43:55.046588 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-9xm4p"
Mar 07 21:43:55.047039 master-0 kubenswrapper[16352]: I0307 21:43:55.046608 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e59d3236f6159c4c720ad5425a8caa4fbb16a5fb4b84ac56237d5274ed4c390"
Mar 07 21:43:55.266388 master-0 kubenswrapper[16352]: I0307 21:43:55.266299 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"]
Mar 07 21:43:55.266830 master-0 kubenswrapper[16352]: E0307 21:43:55.266797 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerName="init"
Mar 07 21:43:55.266830 master-0 kubenswrapper[16352]: I0307 21:43:55.266821 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerName="init"
Mar 07 21:43:55.266954 master-0 kubenswrapper[16352]: E0307 21:43:55.266859 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dec069dd-1a94-4b25-95f1-1346f25cf204" containerName="nova-cell0-conductor-db-sync"
Mar 07 21:43:55.266954 master-0 kubenswrapper[16352]: I0307 21:43:55.266867 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="dec069dd-1a94-4b25-95f1-1346f25cf204" containerName="nova-cell0-conductor-db-sync"
Mar 07 21:43:55.266954 master-0 kubenswrapper[16352]: E0307 21:43:55.266905 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerName="dnsmasq-dns"
Mar 07 21:43:55.266954 master-0 kubenswrapper[16352]: I0307 21:43:55.266912 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerName="dnsmasq-dns"
Mar 07 21:43:55.267260 master-0 kubenswrapper[16352]: I0307 21:43:55.267229 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="dec069dd-1a94-4b25-95f1-1346f25cf204" containerName="nova-cell0-conductor-db-sync"
Mar 07 
21:43:55.267326 master-0 kubenswrapper[16352]: I0307 21:43:55.267262 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="e33d7a05-baac-460b-9f72-133d1f7c7b07" containerName="dnsmasq-dns" Mar 07 21:43:55.268266 master-0 kubenswrapper[16352]: I0307 21:43:55.268141 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 21:43:55.268266 master-0 kubenswrapper[16352]: I0307 21:43:55.268231 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.287300 master-0 kubenswrapper[16352]: I0307 21:43:55.287183 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 07 21:43:55.387562 master-0 kubenswrapper[16352]: I0307 21:43:55.387461 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d0eb36-a67e-48de-afb9-8fe1c782d848-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.387894 master-0 kubenswrapper[16352]: I0307 21:43:55.387587 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d0eb36-a67e-48de-afb9-8fe1c782d848-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.387894 master-0 kubenswrapper[16352]: I0307 21:43:55.387800 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-skggr\" (UniqueName: \"kubernetes.io/projected/43d0eb36-a67e-48de-afb9-8fe1c782d848-kube-api-access-skggr\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " 
pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.491642 master-0 kubenswrapper[16352]: I0307 21:43:55.491547 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-skggr\" (UniqueName: \"kubernetes.io/projected/43d0eb36-a67e-48de-afb9-8fe1c782d848-kube-api-access-skggr\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.491969 master-0 kubenswrapper[16352]: I0307 21:43:55.491931 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d0eb36-a67e-48de-afb9-8fe1c782d848-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.492081 master-0 kubenswrapper[16352]: I0307 21:43:55.492048 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d0eb36-a67e-48de-afb9-8fe1c782d848-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.496374 master-0 kubenswrapper[16352]: I0307 21:43:55.496308 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/43d0eb36-a67e-48de-afb9-8fe1c782d848-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.502722 master-0 kubenswrapper[16352]: I0307 21:43:55.501975 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/43d0eb36-a67e-48de-afb9-8fe1c782d848-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 
21:43:55.520193 master-0 kubenswrapper[16352]: I0307 21:43:55.520142 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-skggr\" (UniqueName: \"kubernetes.io/projected/43d0eb36-a67e-48de-afb9-8fe1c782d848-kube-api-access-skggr\") pod \"nova-cell0-conductor-0\" (UID: \"43d0eb36-a67e-48de-afb9-8fe1c782d848\") " pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:55.608536 master-0 kubenswrapper[16352]: I0307 21:43:55.608465 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:56.152857 master-0 kubenswrapper[16352]: W0307 21:43:56.152736 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod43d0eb36_a67e_48de_afb9_8fe1c782d848.slice/crio-0f2744507f31a505926899bbfcfdb5412e0c5d23b3a153c105042f6a742c6ebd WatchSource:0}: Error finding container 0f2744507f31a505926899bbfcfdb5412e0c5d23b3a153c105042f6a742c6ebd: Status 404 returned error can't find the container with id 0f2744507f31a505926899bbfcfdb5412e0c5d23b3a153c105042f6a742c6ebd Mar 07 21:43:56.155815 master-0 kubenswrapper[16352]: I0307 21:43:56.155745 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 07 21:43:57.083328 master-0 kubenswrapper[16352]: I0307 21:43:57.083241 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"43d0eb36-a67e-48de-afb9-8fe1c782d848","Type":"ContainerStarted","Data":"323044f705c1002d1975cf4840033a27c4777c3f1bdd7f2dc508247f0414d860"} Mar 07 21:43:57.083328 master-0 kubenswrapper[16352]: I0307 21:43:57.083322 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"43d0eb36-a67e-48de-afb9-8fe1c782d848","Type":"ContainerStarted","Data":"0f2744507f31a505926899bbfcfdb5412e0c5d23b3a153c105042f6a742c6ebd"} Mar 07 21:43:57.085668 master-0 
kubenswrapper[16352]: I0307 21:43:57.085608 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 07 21:43:57.121149 master-0 kubenswrapper[16352]: I0307 21:43:57.120999 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.120974749 podStartE2EDuration="2.120974749s" podCreationTimestamp="2026-03-07 21:43:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:43:57.110548689 +0000 UTC m=+1560.181253748" watchObservedRunningTime="2026-03-07 21:43:57.120974749 +0000 UTC m=+1560.191679818" Mar 07 21:44:05.668412 master-0 kubenswrapper[16352]: I0307 21:44:05.668327 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 07 21:44:06.323900 master-0 kubenswrapper[16352]: I0307 21:44:06.323817 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-c5sqt"] Mar 07 21:44:06.326652 master-0 kubenswrapper[16352]: I0307 21:44:06.326596 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.335233 master-0 kubenswrapper[16352]: I0307 21:44:06.335136 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 07 21:44:06.335538 master-0 kubenswrapper[16352]: I0307 21:44:06.335466 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 07 21:44:06.364341 master-0 kubenswrapper[16352]: I0307 21:44:06.364063 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-c5sqt"] Mar 07 21:44:06.429749 master-0 kubenswrapper[16352]: I0307 21:44:06.426356 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 07 21:44:06.435431 master-0 kubenswrapper[16352]: I0307 21:44:06.435111 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.440177 master-0 kubenswrapper[16352]: I0307 21:44:06.439293 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-config-data\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.440177 master-0 kubenswrapper[16352]: I0307 21:44:06.439369 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.440177 master-0 kubenswrapper[16352]: I0307 21:44:06.439487 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wcvgg\" (UniqueName: \"kubernetes.io/projected/bd475e91-463a-4538-84af-ba2b678d7f06-kube-api-access-wcvgg\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.440177 master-0 kubenswrapper[16352]: I0307 21:44:06.439924 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-scripts\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.458788 master-0 kubenswrapper[16352]: I0307 21:44:06.445968 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-ironic-compute-config-data" Mar 07 21:44:06.538390 master-0 kubenswrapper[16352]: I0307 21:44:06.538287 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"] Mar 07 21:44:06.542715 master-0 kubenswrapper[16352]: I0307 21:44:06.542624 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-scripts\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.545194 master-0 kubenswrapper[16352]: I0307 21:44:06.544140 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sgcmt\" (UniqueName: \"kubernetes.io/projected/002ef748-e065-42d6-8fa2-11d819c5ce98-kube-api-access-sgcmt\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.545194 master-0 
kubenswrapper[16352]: I0307 21:44:06.544243 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002ef748-e065-42d6-8fa2-11d819c5ce98-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.545194 master-0 kubenswrapper[16352]: I0307 21:44:06.544283 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-config-data\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.545194 master-0 kubenswrapper[16352]: I0307 21:44:06.544316 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.545194 master-0 kubenswrapper[16352]: I0307 21:44:06.544362 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002ef748-e065-42d6-8fa2-11d819c5ce98-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.545194 master-0 kubenswrapper[16352]: I0307 21:44:06.544429 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wcvgg\" (UniqueName: \"kubernetes.io/projected/bd475e91-463a-4538-84af-ba2b678d7f06-kube-api-access-wcvgg\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: 
\"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.547946 master-0 kubenswrapper[16352]: I0307 21:44:06.547510 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-scripts\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.554065 master-0 kubenswrapper[16352]: I0307 21:44:06.553576 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-config-data\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.554065 master-0 kubenswrapper[16352]: I0307 21:44:06.553977 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.566735 master-0 kubenswrapper[16352]: I0307 21:44:06.565033 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wcvgg\" (UniqueName: \"kubernetes.io/projected/bd475e91-463a-4538-84af-ba2b678d7f06-kube-api-access-wcvgg\") pod \"nova-cell0-cell-mapping-c5sqt\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") " pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.627963 master-0 kubenswrapper[16352]: I0307 21:44:06.627876 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:44:06.630516 master-0 kubenswrapper[16352]: I0307 21:44:06.630483 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:44:06.639312 master-0 kubenswrapper[16352]: I0307 21:44:06.639241 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 07 21:44:06.641813 master-0 kubenswrapper[16352]: I0307 21:44:06.641356 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:44:06.648179 master-0 kubenswrapper[16352]: I0307 21:44:06.647170 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002ef748-e065-42d6-8fa2-11d819c5ce98-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.648179 master-0 kubenswrapper[16352]: I0307 21:44:06.647290 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002ef748-e065-42d6-8fa2-11d819c5ce98-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.648179 master-0 kubenswrapper[16352]: I0307 21:44:06.647475 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sgcmt\" (UniqueName: \"kubernetes.io/projected/002ef748-e065-42d6-8fa2-11d819c5ce98-kube-api-access-sgcmt\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.651603 master-0 kubenswrapper[16352]: I0307 21:44:06.651559 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/002ef748-e065-42d6-8fa2-11d819c5ce98-config-data\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: 
\"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.656736 master-0 kubenswrapper[16352]: I0307 21:44:06.655240 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/002ef748-e065-42d6-8fa2-11d819c5ce98-combined-ca-bundle\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.682358 master-0 kubenswrapper[16352]: I0307 21:44:06.679209 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sgcmt\" (UniqueName: \"kubernetes.io/projected/002ef748-e065-42d6-8fa2-11d819c5ce98-kube-api-access-sgcmt\") pod \"nova-cell1-compute-ironic-compute-0\" (UID: \"002ef748-e065-42d6-8fa2-11d819c5ce98\") " pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.713846 master-0 kubenswrapper[16352]: I0307 21:44:06.713775 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c5sqt" Mar 07 21:44:06.751064 master-0 kubenswrapper[16352]: I0307 21:44:06.750672 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-config-data\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.751466 master-0 kubenswrapper[16352]: I0307 21:44:06.751443 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqp6w\" (UniqueName: \"kubernetes.io/projected/e787124b-7250-4b3f-953e-b91655e82506-kube-api-access-jqp6w\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.752468 master-0 kubenswrapper[16352]: I0307 21:44:06.752448 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.841790 master-0 kubenswrapper[16352]: I0307 21:44:06.836800 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-compute-ironic-compute-0" Mar 07 21:44:06.876728 master-0 kubenswrapper[16352]: I0307 21:44:06.857889 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-config-data\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.876728 master-0 kubenswrapper[16352]: I0307 21:44:06.857988 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqp6w\" (UniqueName: \"kubernetes.io/projected/e787124b-7250-4b3f-953e-b91655e82506-kube-api-access-jqp6w\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.876728 master-0 kubenswrapper[16352]: I0307 21:44:06.858413 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.876728 master-0 kubenswrapper[16352]: I0307 21:44:06.865582 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-config-data\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.876728 master-0 kubenswrapper[16352]: I0307 21:44:06.872310 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:06.876728 master-0 kubenswrapper[16352]: I0307 21:44:06.874782 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.888798 master-0 kubenswrapper[16352]: I0307 21:44:06.888558 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:44:06.898867 master-0 kubenswrapper[16352]: I0307 21:44:06.893926 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 07 21:44:06.918533 master-0 kubenswrapper[16352]: I0307 21:44:06.908478 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:06.936233 master-0 kubenswrapper[16352]: I0307 21:44:06.926360 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:06.936233 master-0 kubenswrapper[16352]: I0307 21:44:06.928827 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:06.939896 master-0 kubenswrapper[16352]: I0307 21:44:06.937906 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 07 21:44:06.971432 master-0 kubenswrapper[16352]: I0307 21:44:06.967368 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:06.971579 master-0 kubenswrapper[16352]: I0307 21:44:06.971540 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqp6w\" (UniqueName: \"kubernetes.io/projected/e787124b-7250-4b3f-953e-b91655e82506-kube-api-access-jqp6w\") pod \"nova-scheduler-0\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: I0307 21:44:06.972401 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-config-data\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0" Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: I0307 21:44:06.972556 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0" Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: I0307 21:44:06.972633 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0" Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: 
I0307 21:44:06.973331 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd90cad-b881-461f-8fa9-6e1d5925f6f6-logs\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: I0307 21:44:06.973642 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baf221a2-4d50-4677-b110-466e8d64e3a8-logs\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: I0307 21:44:06.973769 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbfrw\" (UniqueName: \"kubernetes.io/projected/baf221a2-4d50-4677-b110-466e8d64e3a8-kube-api-access-zbfrw\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: I0307 21:44:06.973822 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ldhd\" (UniqueName: \"kubernetes.io/projected/afd90cad-b881-461f-8fa9-6e1d5925f6f6-kube-api-access-7ldhd\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:06.989238 master-0 kubenswrapper[16352]: I0307 21:44:06.973887 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-config-data\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.014785 master-0 kubenswrapper[16352]: I0307 21:44:06.994938 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 07 21:44:07.014785 master-0 kubenswrapper[16352]: I0307 21:44:06.997234 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.014785 master-0 kubenswrapper[16352]: I0307 21:44:07.011304 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.076839 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd90cad-b881-461f-8fa9-6e1d5925f6f6-logs\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.076925 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077019 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077074 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baf221a2-4d50-4677-b110-466e8d64e3a8-logs\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077101 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbfrw\" (UniqueName: \"kubernetes.io/projected/baf221a2-4d50-4677-b110-466e8d64e3a8-kube-api-access-zbfrw\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077129 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ldhd\" (UniqueName: \"kubernetes.io/projected/afd90cad-b881-461f-8fa9-6e1d5925f6f6-kube-api-access-7ldhd\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077158 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-config-data\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077191 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-config-data\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077247 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077281 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ftx7\" (UniqueName: \"kubernetes.io/projected/4fd23d6c-2193-4d30-90d9-c34092e4dc62-kube-api-access-4ftx7\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.077313 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.081082 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd90cad-b881-461f-8fa9-6e1d5925f6f6-logs\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.081394 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baf221a2-4d50-4677-b110-466e8d64e3a8-logs\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.089347 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-config-data\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.094756 master-0 kubenswrapper[16352]: I0307 21:44:07.094101 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-config-data\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.119826 master-0 kubenswrapper[16352]: I0307 21:44:07.099755 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 07 21:44:07.132819 master-0 kubenswrapper[16352]: I0307 21:44:07.132059 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zbfrw\" (UniqueName: \"kubernetes.io/projected/baf221a2-4d50-4677-b110-466e8d64e3a8-kube-api-access-zbfrw\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.136698 master-0 kubenswrapper[16352]: I0307 21:44:07.133526 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " pod="openstack/nova-api-0"
Mar 07 21:44:07.136698 master-0 kubenswrapper[16352]: I0307 21:44:07.133624 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.161796 master-0 kubenswrapper[16352]: I0307 21:44:07.137538 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ldhd\" (UniqueName: \"kubernetes.io/projected/afd90cad-b881-461f-8fa9-6e1d5925f6f6-kube-api-access-7ldhd\") pod \"nova-metadata-0\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " pod="openstack/nova-metadata-0"
Mar 07 21:44:07.161796 master-0 kubenswrapper[16352]: I0307 21:44:07.154516 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8459745b77-pkh7k"]
Mar 07 21:44:07.161796 master-0 kubenswrapper[16352]: I0307 21:44:07.156997 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.189048 master-0 kubenswrapper[16352]: I0307 21:44:07.187418 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.189048 master-0 kubenswrapper[16352]: I0307 21:44:07.187518 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.189048 master-0 kubenswrapper[16352]: I0307 21:44:07.187639 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ftx7\" (UniqueName: \"kubernetes.io/projected/4fd23d6c-2193-4d30-90d9-c34092e4dc62-kube-api-access-4ftx7\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.212780 master-0 kubenswrapper[16352]: I0307 21:44:07.210382 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8459745b77-pkh7k"]
Mar 07 21:44:07.232726 master-0 kubenswrapper[16352]: I0307 21:44:07.213631 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.232726 master-0 kubenswrapper[16352]: I0307 21:44:07.215099 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0"
Mar 07 21:44:07.311818 master-0 kubenswrapper[16352]: I0307 21:44:07.272512 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.311818 master-0 kubenswrapper[16352]: I0307 21:44:07.273957 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.334815 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ftx7\" (UniqueName: \"kubernetes.io/projected/4fd23d6c-2193-4d30-90d9-c34092e4dc62-kube-api-access-4ftx7\") pod \"nova-cell1-novncproxy-0\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.335843 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.336971 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mbx4\" (UniqueName: \"kubernetes.io/projected/f4bc275d-a9ea-41ac-840a-f9954b05742c-kube-api-access-9mbx4\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.337076 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-config\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.337128 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-svc\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.337188 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-sb\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.337226 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-swift-storage-0\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.352451 master-0 kubenswrapper[16352]: I0307 21:44:07.337274 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-nb\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.357083 master-0 kubenswrapper[16352]: I0307 21:44:07.357020 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:07.445635 master-0 kubenswrapper[16352]: I0307 21:44:07.443388 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-sb\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.445635 master-0 kubenswrapper[16352]: I0307 21:44:07.443476 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-swift-storage-0\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.445635 master-0 kubenswrapper[16352]: I0307 21:44:07.443533 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-nb\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.445635 master-0 kubenswrapper[16352]: I0307 21:44:07.443620 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mbx4\" (UniqueName: \"kubernetes.io/projected/f4bc275d-a9ea-41ac-840a-f9954b05742c-kube-api-access-9mbx4\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.446896 master-0 kubenswrapper[16352]: I0307 21:44:07.446834 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-swift-storage-0\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.449802 master-0 kubenswrapper[16352]: I0307 21:44:07.447982 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-config\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.449802 master-0 kubenswrapper[16352]: I0307 21:44:07.447991 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-sb\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.449802 master-0 kubenswrapper[16352]: I0307 21:44:07.448199 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-svc\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.449802 master-0 kubenswrapper[16352]: I0307 21:44:07.448948 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-config\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.449802 master-0 kubenswrapper[16352]: I0307 21:44:07.449122 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-svc\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.460722 master-0 kubenswrapper[16352]: I0307 21:44:07.457409 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-nb\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.484490 master-0 kubenswrapper[16352]: I0307 21:44:07.484426 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mbx4\" (UniqueName: \"kubernetes.io/projected/f4bc275d-a9ea-41ac-840a-f9954b05742c-kube-api-access-9mbx4\") pod \"dnsmasq-dns-8459745b77-pkh7k\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.489073 master-0 kubenswrapper[16352]: I0307 21:44:07.488998 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:07.565871 master-0 kubenswrapper[16352]: I0307 21:44:07.565814 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-c5sqt"]
Mar 07 21:44:07.904535 master-0 kubenswrapper[16352]: I0307 21:44:07.897712 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"]
Mar 07 21:44:07.973968 master-0 kubenswrapper[16352]: I0307 21:44:07.973901 16352 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 07 21:44:07.974765 master-0 kubenswrapper[16352]: I0307 21:44:07.974723 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 21:44:08.095779 master-0 kubenswrapper[16352]: I0307 21:44:08.095721 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-compute-ironic-compute-0"]
Mar 07 21:44:08.142602 master-0 kubenswrapper[16352]: I0307 21:44:08.142541 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2rz24"]
Mar 07 21:44:08.145355 master-0 kubenswrapper[16352]: I0307 21:44:08.145331 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.148462 master-0 kubenswrapper[16352]: I0307 21:44:08.148401 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts"
Mar 07 21:44:08.148646 master-0 kubenswrapper[16352]: I0307 21:44:08.148592 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 07 21:44:08.165132 master-0 kubenswrapper[16352]: I0307 21:44:08.165068 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2rz24"]
Mar 07 21:44:08.227511 master-0 kubenswrapper[16352]: I0307 21:44:08.227257 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.227511 master-0 kubenswrapper[16352]: I0307 21:44:08.227354 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-46lxz\" (UniqueName: \"kubernetes.io/projected/a726ded1-f768-48e1-87c1-1b99262d45e1-kube-api-access-46lxz\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.227511 master-0 kubenswrapper[16352]: I0307 21:44:08.227471 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-config-data\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.228180 master-0 kubenswrapper[16352]: I0307 21:44:08.228121 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-scripts\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.266191 master-0 kubenswrapper[16352]: I0307 21:44:08.263062 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:44:08.276846 master-0 kubenswrapper[16352]: W0307 21:44:08.276656 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbaf221a2_4d50_4677_b110_466e8d64e3a8.slice/crio-40b56398fa814da104fdf56b86b9603ba1174b63cb45154befeb070ac650c0b3 WatchSource:0}: Error finding container 40b56398fa814da104fdf56b86b9603ba1174b63cb45154befeb070ac650c0b3: Status 404 returned error can't find the container with id 40b56398fa814da104fdf56b86b9603ba1174b63cb45154befeb070ac650c0b3
Mar 07 21:44:08.330153 master-0 kubenswrapper[16352]: I0307 21:44:08.330043 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.330153 master-0 kubenswrapper[16352]: I0307 21:44:08.330139 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-46lxz\" (UniqueName: \"kubernetes.io/projected/a726ded1-f768-48e1-87c1-1b99262d45e1-kube-api-access-46lxz\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.330334 master-0 kubenswrapper[16352]: I0307 21:44:08.330259 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-config-data\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.330334 master-0 kubenswrapper[16352]: I0307 21:44:08.330306 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-scripts\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.334412 master-0 kubenswrapper[16352]: I0307 21:44:08.334337 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-scripts\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.336825 master-0 kubenswrapper[16352]: I0307 21:44:08.336771 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.344224 master-0 kubenswrapper[16352]: I0307 21:44:08.344178 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-config-data\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.361894 master-0 kubenswrapper[16352]: I0307 21:44:08.361843 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-46lxz\" (UniqueName: \"kubernetes.io/projected/a726ded1-f768-48e1-87c1-1b99262d45e1-kube-api-access-46lxz\") pod \"nova-cell1-conductor-db-sync-2rz24\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") " pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.395758 master-0 kubenswrapper[16352]: I0307 21:44:08.395656 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c5sqt" event={"ID":"bd475e91-463a-4538-84af-ba2b678d7f06","Type":"ContainerStarted","Data":"43d9dc91614e24b8ae37ce9d5e437173479c23869edf0df37c3e4904686c5776"}
Mar 07 21:44:08.395758 master-0 kubenswrapper[16352]: I0307 21:44:08.395752 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c5sqt" event={"ID":"bd475e91-463a-4538-84af-ba2b678d7f06","Type":"ContainerStarted","Data":"ae3119a195741ab2c81cdcb91396dc058d9ff9ce6cd2a4644470c3587c51748a"}
Mar 07 21:44:08.397669 master-0 kubenswrapper[16352]: I0307 21:44:08.397602 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"baf221a2-4d50-4677-b110-466e8d64e3a8","Type":"ContainerStarted","Data":"40b56398fa814da104fdf56b86b9603ba1174b63cb45154befeb070ac650c0b3"}
Mar 07 21:44:08.399185 master-0 kubenswrapper[16352]: I0307 21:44:08.399152 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"002ef748-e065-42d6-8fa2-11d819c5ce98","Type":"ContainerStarted","Data":"2d6f0bddadd0f35a604c6d6e1bf7a10f35c8abe9da42b7e827e4a450789dea88"}
Mar 07 21:44:08.402517 master-0 kubenswrapper[16352]: I0307 21:44:08.400654 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e787124b-7250-4b3f-953e-b91655e82506","Type":"ContainerStarted","Data":"b1a2fbfda0113819e21e5a5cc73858da71238f92e353d0f9236aa4666f748de8"}
Mar 07 21:44:08.406562 master-0 kubenswrapper[16352]: I0307 21:44:08.406505 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afd90cad-b881-461f-8fa9-6e1d5925f6f6","Type":"ContainerStarted","Data":"6246c8f0b6e72aa9bd8ab3bda3a9e57af728b1841390aa73264a2f922b997c67"}
Mar 07 21:44:08.448399 master-0 kubenswrapper[16352]: I0307 21:44:08.448278 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-c5sqt" podStartSLOduration=2.448250911 podStartE2EDuration="2.448250911s" podCreationTimestamp="2026-03-07 21:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:08.426782806 +0000 UTC m=+1571.497487865" watchObservedRunningTime="2026-03-07 21:44:08.448250911 +0000 UTC m=+1571.518955970"
Mar 07 21:44:08.484216 master-0 kubenswrapper[16352]: I0307 21:44:08.480496 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:08.494053 master-0 kubenswrapper[16352]: I0307 21:44:08.493093 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 07 21:44:08.510747 master-0 kubenswrapper[16352]: I0307 21:44:08.510556 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8459745b77-pkh7k"]
Mar 07 21:44:09.136182 master-0 kubenswrapper[16352]: I0307 21:44:09.133578 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2rz24"]
Mar 07 21:44:09.440835 master-0 kubenswrapper[16352]: I0307 21:44:09.440662 16352 generic.go:334] "Generic (PLEG): container finished" podID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerID="c4d81c90e2f0bee9707d97ca52bcbf6a09a0e7806ca42779a4c6ba4ac79f3046" exitCode=0
Mar 07 21:44:09.440835 master-0 kubenswrapper[16352]: I0307 21:44:09.440767 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" event={"ID":"f4bc275d-a9ea-41ac-840a-f9954b05742c","Type":"ContainerDied","Data":"c4d81c90e2f0bee9707d97ca52bcbf6a09a0e7806ca42779a4c6ba4ac79f3046"}
Mar 07 21:44:09.441124 master-0 kubenswrapper[16352]: I0307 21:44:09.440840 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" event={"ID":"f4bc275d-a9ea-41ac-840a-f9954b05742c","Type":"ContainerStarted","Data":"dec83259d2ac6258b0daf9530dda83637c1ab1ebf153a53483b971b6cc9d4672"}
Mar 07 21:44:09.445119 master-0 kubenswrapper[16352]: I0307 21:44:09.445041 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fd23d6c-2193-4d30-90d9-c34092e4dc62","Type":"ContainerStarted","Data":"26cae7ecc153d8b4387ce4fad05918f0e85de6a1f4fb5372290728435e3b80d9"}
Mar 07 21:44:09.694367 master-0 kubenswrapper[16352]: W0307 21:44:09.694166 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda726ded1_f768_48e1_87c1_1b99262d45e1.slice/crio-68db8146aafed22084c32abd309143a43cfdf9c551262a6fd97522091a99fc6c WatchSource:0}: Error finding container 68db8146aafed22084c32abd309143a43cfdf9c551262a6fd97522091a99fc6c: Status 404 returned error can't find the container with id 68db8146aafed22084c32abd309143a43cfdf9c551262a6fd97522091a99fc6c
Mar 07 21:44:10.476186 master-0 kubenswrapper[16352]: I0307 21:44:10.476094 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2rz24" event={"ID":"a726ded1-f768-48e1-87c1-1b99262d45e1","Type":"ContainerStarted","Data":"68db8146aafed22084c32abd309143a43cfdf9c551262a6fd97522091a99fc6c"}
Mar 07 21:44:10.787705 master-0 kubenswrapper[16352]: I0307 21:44:10.786402 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 07 21:44:10.822595 master-0 kubenswrapper[16352]: I0307 21:44:10.822500 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"]
Mar 07 21:44:11.492644 master-0 kubenswrapper[16352]: I0307 21:44:11.492571 16352 generic.go:334] "Generic (PLEG): container finished" podID="121505c3-5091-4945-a0aa-ec97b5f45ce5" containerID="03f8ba09e8c196f3b491487a7dd3bd9b188bb1c612c7b9364bb0a665b63ecfce" exitCode=0
Mar 07 21:44:11.493442 master-0 kubenswrapper[16352]: I0307 21:44:11.492667 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerDied","Data":"03f8ba09e8c196f3b491487a7dd3bd9b188bb1c612c7b9364bb0a665b63ecfce"}
Mar 07 21:44:12.515050 master-0 kubenswrapper[16352]: I0307 21:44:12.514625 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fd23d6c-2193-4d30-90d9-c34092e4dc62","Type":"ContainerStarted","Data":"8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b"}
Mar 07 21:44:12.515050 master-0 kubenswrapper[16352]: I0307 21:44:12.514881 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="4fd23d6c-2193-4d30-90d9-c34092e4dc62" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b" gracePeriod=30
Mar 07 21:44:12.534150 master-0 kubenswrapper[16352]: I0307 21:44:12.534056 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"baf221a2-4d50-4677-b110-466e8d64e3a8","Type":"ContainerStarted","Data":"e2dedb2cb9585b01fbdec6d2c22662c5b76db2a1bf6d57f77c4d5b94eace2971"}
Mar 07 21:44:12.534150 master-0 kubenswrapper[16352]: I0307 21:44:12.534132 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"baf221a2-4d50-4677-b110-466e8d64e3a8","Type":"ContainerStarted","Data":"79049fce9fb2a5ced5424220a56dd87c4ecfa50bb4de1709f5a077e0bf3916c8"}
Mar 07 21:44:12.549100 master-0 kubenswrapper[16352]: I0307 21:44:12.549009 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerStarted","Data":"5081c09834f7cd64ddba2839728d483ed30b0c8641da69785553de3fb4a6b25b"}
Mar 07 21:44:12.549958 master-0 kubenswrapper[16352]: I0307 21:44:12.549659 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.750716139 podStartE2EDuration="6.549631434s" podCreationTimestamp="2026-03-07 21:44:06 +0000 UTC" firstStartedPulling="2026-03-07 21:44:08.538898109 +0000 UTC m=+1571.609603168" lastFinishedPulling="2026-03-07 21:44:11.337813414 +0000 UTC m=+1574.408518463" observedRunningTime="2026-03-07 21:44:12.54363851 +0000 UTC m=+1575.614343569" watchObservedRunningTime="2026-03-07 21:44:12.549631434 +0000 UTC m=+1575.620336503"
Mar 07 21:44:12.560048 master-0 kubenswrapper[16352]: I0307 21:44:12.559986 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" event={"ID":"f4bc275d-a9ea-41ac-840a-f9954b05742c","Type":"ContainerStarted","Data":"4a83490f6a2c30c916a511e48f131f32ce8ba7893ec155ff77a68c927c30c797"}
Mar 07 21:44:12.561300 master-0 kubenswrapper[16352]: I0307 21:44:12.561256 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:12.570118 master-0 kubenswrapper[16352]: I0307 21:44:12.569954 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e787124b-7250-4b3f-953e-b91655e82506","Type":"ContainerStarted","Data":"970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a"}
Mar 07 21:44:12.577293 master-0 kubenswrapper[16352]: I0307 21:44:12.577062 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=3.543990563 podStartE2EDuration="6.572725059s" podCreationTimestamp="2026-03-07 21:44:06 +0000 UTC" firstStartedPulling="2026-03-07 21:44:08.306494116 +0000 UTC m=+1571.377199175" lastFinishedPulling="2026-03-07 21:44:11.335228592 +0000 UTC m=+1574.405933671" observedRunningTime="2026-03-07 21:44:12.567195076 +0000 UTC m=+1575.637900135" watchObservedRunningTime="2026-03-07 21:44:12.572725059 +0000 UTC m=+1575.643430118"
Mar 07 21:44:12.583956 master-0 kubenswrapper[16352]: I0307 21:44:12.582397 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afd90cad-b881-461f-8fa9-6e1d5925f6f6","Type":"ContainerStarted","Data":"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256"}
Mar 07 21:44:12.583956 master-0 kubenswrapper[16352]: I0307 21:44:12.582987 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afd90cad-b881-461f-8fa9-6e1d5925f6f6","Type":"ContainerStarted","Data":"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750"}
Mar 07 21:44:12.583956 master-0 kubenswrapper[16352]: I0307 21:44:12.583496 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-log" containerID="cri-o://b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750" gracePeriod=30
Mar 07 21:44:12.583956 master-0 kubenswrapper[16352]: I0307 21:44:12.583898 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-metadata" containerID="cri-o://2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256" gracePeriod=30
Mar 07 21:44:12.596989 master-0 kubenswrapper[16352]: I0307 21:44:12.595839 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2rz24" event={"ID":"a726ded1-f768-48e1-87c1-1b99262d45e1","Type":"ContainerStarted","Data":"bc13e8beb36012f16bde41d8bcda1e30055c34fd50a42f60875a5404f6660ecf"}
Mar 07 21:44:12.608642 master-0 kubenswrapper[16352]: I0307 21:44:12.607781 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" podStartSLOduration=6.60774234 podStartE2EDuration="6.60774234s" podCreationTimestamp="2026-03-07 21:44:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:12.594996934 +0000 UTC m=+1575.665702003" watchObservedRunningTime="2026-03-07 21:44:12.60774234 +0000 UTC m=+1575.678447399"
Mar 07 21:44:12.653214 master-0 kubenswrapper[16352]: I0307 21:44:12.653128 16352 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.307547773 podStartE2EDuration="6.653098419s" podCreationTimestamp="2026-03-07 21:44:06 +0000 UTC" firstStartedPulling="2026-03-07 21:44:07.989495601 +0000 UTC m=+1571.060200660" lastFinishedPulling="2026-03-07 21:44:11.335046247 +0000 UTC m=+1574.405751306" observedRunningTime="2026-03-07 21:44:12.615766403 +0000 UTC m=+1575.686471462" watchObservedRunningTime="2026-03-07 21:44:12.653098419 +0000 UTC m=+1575.723803478" Mar 07 21:44:12.656186 master-0 kubenswrapper[16352]: I0307 21:44:12.656138 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=3.296338494 podStartE2EDuration="6.656124532s" podCreationTimestamp="2026-03-07 21:44:06 +0000 UTC" firstStartedPulling="2026-03-07 21:44:07.973851825 +0000 UTC m=+1571.044556884" lastFinishedPulling="2026-03-07 21:44:11.333637863 +0000 UTC m=+1574.404342922" observedRunningTime="2026-03-07 21:44:12.648449217 +0000 UTC m=+1575.719154276" watchObservedRunningTime="2026-03-07 21:44:12.656124532 +0000 UTC m=+1575.726829591" Mar 07 21:44:12.695969 master-0 kubenswrapper[16352]: I0307 21:44:12.695831 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-2rz24" podStartSLOduration=4.695759444 podStartE2EDuration="4.695759444s" podCreationTimestamp="2026-03-07 21:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:12.670382875 +0000 UTC m=+1575.741087924" watchObservedRunningTime="2026-03-07 21:44:12.695759444 +0000 UTC m=+1575.766464503" Mar 07 21:44:13.163071 master-0 kubenswrapper[16352]: I0307 21:44:13.162208 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:44:13.239351 master-0 kubenswrapper[16352]: I0307 21:44:13.239008 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-config-data\") pod \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " Mar 07 21:44:13.239351 master-0 kubenswrapper[16352]: I0307 21:44:13.239186 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd90cad-b881-461f-8fa9-6e1d5925f6f6-logs\") pod \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " Mar 07 21:44:13.244327 master-0 kubenswrapper[16352]: I0307 21:44:13.244252 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/afd90cad-b881-461f-8fa9-6e1d5925f6f6-logs" (OuterVolumeSpecName: "logs") pod "afd90cad-b881-461f-8fa9-6e1d5925f6f6" (UID: "afd90cad-b881-461f-8fa9-6e1d5925f6f6"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:44:13.285785 master-0 kubenswrapper[16352]: I0307 21:44:13.269949 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-combined-ca-bundle\") pod \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " Mar 07 21:44:13.285785 master-0 kubenswrapper[16352]: I0307 21:44:13.283291 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ldhd\" (UniqueName: \"kubernetes.io/projected/afd90cad-b881-461f-8fa9-6e1d5925f6f6-kube-api-access-7ldhd\") pod \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\" (UID: \"afd90cad-b881-461f-8fa9-6e1d5925f6f6\") " Mar 07 21:44:13.285785 master-0 kubenswrapper[16352]: I0307 21:44:13.279140 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-config-data" (OuterVolumeSpecName: "config-data") pod "afd90cad-b881-461f-8fa9-6e1d5925f6f6" (UID: "afd90cad-b881-461f-8fa9-6e1d5925f6f6"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:13.286173 master-0 kubenswrapper[16352]: I0307 21:44:13.286135 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:13.286227 master-0 kubenswrapper[16352]: I0307 21:44:13.286176 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/afd90cad-b881-461f-8fa9-6e1d5925f6f6-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:13.288942 master-0 kubenswrapper[16352]: I0307 21:44:13.288861 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd90cad-b881-461f-8fa9-6e1d5925f6f6-kube-api-access-7ldhd" (OuterVolumeSpecName: "kube-api-access-7ldhd") pod "afd90cad-b881-461f-8fa9-6e1d5925f6f6" (UID: "afd90cad-b881-461f-8fa9-6e1d5925f6f6"). InnerVolumeSpecName "kube-api-access-7ldhd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:44:13.326940 master-0 kubenswrapper[16352]: I0307 21:44:13.326851 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "afd90cad-b881-461f-8fa9-6e1d5925f6f6" (UID: "afd90cad-b881-461f-8fa9-6e1d5925f6f6"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:13.389778 master-0 kubenswrapper[16352]: I0307 21:44:13.389647 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/afd90cad-b881-461f-8fa9-6e1d5925f6f6-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:13.389778 master-0 kubenswrapper[16352]: I0307 21:44:13.389739 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ldhd\" (UniqueName: \"kubernetes.io/projected/afd90cad-b881-461f-8fa9-6e1d5925f6f6-kube-api-access-7ldhd\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:13.623280 master-0 kubenswrapper[16352]: I0307 21:44:13.622515 16352 generic.go:334] "Generic (PLEG): container finished" podID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerID="2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256" exitCode=0 Mar 07 21:44:13.623280 master-0 kubenswrapper[16352]: I0307 21:44:13.622586 16352 generic.go:334] "Generic (PLEG): container finished" podID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerID="b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750" exitCode=143 Mar 07 21:44:13.623280 master-0 kubenswrapper[16352]: I0307 21:44:13.622743 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afd90cad-b881-461f-8fa9-6e1d5925f6f6","Type":"ContainerDied","Data":"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256"} Mar 07 21:44:13.623280 master-0 kubenswrapper[16352]: I0307 21:44:13.622788 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"afd90cad-b881-461f-8fa9-6e1d5925f6f6","Type":"ContainerDied","Data":"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750"} Mar 07 21:44:13.623280 master-0 kubenswrapper[16352]: I0307 21:44:13.622809 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"afd90cad-b881-461f-8fa9-6e1d5925f6f6","Type":"ContainerDied","Data":"6246c8f0b6e72aa9bd8ab3bda3a9e57af728b1841390aa73264a2f922b997c67"} Mar 07 21:44:13.623280 master-0 kubenswrapper[16352]: I0307 21:44:13.622839 16352 scope.go:117] "RemoveContainer" containerID="2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256" Mar 07 21:44:13.625738 master-0 kubenswrapper[16352]: I0307 21:44:13.623335 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:44:13.644668 master-0 kubenswrapper[16352]: I0307 21:44:13.644028 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerStarted","Data":"9ac63315339bdac6aa57b22e8f733cc8e56ee8de41719a2a78be4ab098fff2df"} Mar 07 21:44:13.644668 master-0 kubenswrapper[16352]: I0307 21:44:13.644110 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ironic-conductor-0" event={"ID":"121505c3-5091-4945-a0aa-ec97b5f45ce5","Type":"ContainerStarted","Data":"39b9016c8e80b712028533c1c430d2dbb0efda637230677a933583af7f8a03dc"} Mar 07 21:44:13.716441 master-0 kubenswrapper[16352]: I0307 21:44:13.712304 16352 scope.go:117] "RemoveContainer" containerID="b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750" Mar 07 21:44:13.728843 master-0 kubenswrapper[16352]: I0307 21:44:13.723522 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ironic-conductor-0" podStartSLOduration=62.889170323 podStartE2EDuration="1m41.723471942s" podCreationTimestamp="2026-03-07 21:42:32 +0000 UTC" firstStartedPulling="2026-03-07 21:42:44.888282032 +0000 UTC m=+1487.958987091" lastFinishedPulling="2026-03-07 21:43:23.722583661 +0000 UTC m=+1526.793288710" observedRunningTime="2026-03-07 21:44:13.702010886 +0000 UTC m=+1576.772715945" watchObservedRunningTime="2026-03-07 21:44:13.723471942 +0000 UTC m=+1576.794177001" 
Mar 07 21:44:13.756255 master-0 kubenswrapper[16352]: I0307 21:44:13.754419 16352 scope.go:117] "RemoveContainer" containerID="2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256" Mar 07 21:44:13.759350 master-0 kubenswrapper[16352]: E0307 21:44:13.759271 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256\": container with ID starting with 2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256 not found: ID does not exist" containerID="2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256" Mar 07 21:44:13.759487 master-0 kubenswrapper[16352]: I0307 21:44:13.759392 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256"} err="failed to get container status \"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256\": rpc error: code = NotFound desc = could not find container \"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256\": container with ID starting with 2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256 not found: ID does not exist" Mar 07 21:44:13.759582 master-0 kubenswrapper[16352]: I0307 21:44:13.759493 16352 scope.go:117] "RemoveContainer" containerID="b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750" Mar 07 21:44:13.760075 master-0 kubenswrapper[16352]: E0307 21:44:13.760041 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750\": container with ID starting with b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750 not found: ID does not exist" containerID="b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750" Mar 07 21:44:13.760148 master-0 
kubenswrapper[16352]: I0307 21:44:13.760095 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750"} err="failed to get container status \"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750\": rpc error: code = NotFound desc = could not find container \"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750\": container with ID starting with b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750 not found: ID does not exist" Mar 07 21:44:13.760148 master-0 kubenswrapper[16352]: I0307 21:44:13.760122 16352 scope.go:117] "RemoveContainer" containerID="2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256" Mar 07 21:44:13.760616 master-0 kubenswrapper[16352]: I0307 21:44:13.760563 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256"} err="failed to get container status \"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256\": rpc error: code = NotFound desc = could not find container \"2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256\": container with ID starting with 2d60436e61b0f9f093ae85efcdb095803ab55bf09e37a60100e7478b12c9a256 not found: ID does not exist" Mar 07 21:44:13.760708 master-0 kubenswrapper[16352]: I0307 21:44:13.760622 16352 scope.go:117] "RemoveContainer" containerID="b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750" Mar 07 21:44:13.761796 master-0 kubenswrapper[16352]: I0307 21:44:13.761656 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750"} err="failed to get container status \"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750\": rpc error: code = NotFound desc = could not find container 
\"b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750\": container with ID starting with b734c7b35538c8efd52c4d1f3f940d4fd551b3d33f9f4924ff58ad6557be8750 not found: ID does not exist" Mar 07 21:44:13.773601 master-0 kubenswrapper[16352]: I0307 21:44:13.773441 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:13.817690 master-0 kubenswrapper[16352]: I0307 21:44:13.817567 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:13.837842 master-0 kubenswrapper[16352]: I0307 21:44:13.837771 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:13.838571 master-0 kubenswrapper[16352]: E0307 21:44:13.838548 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-metadata" Mar 07 21:44:13.838571 master-0 kubenswrapper[16352]: I0307 21:44:13.838570 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-metadata" Mar 07 21:44:13.838664 master-0 kubenswrapper[16352]: E0307 21:44:13.838599 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-log" Mar 07 21:44:13.838664 master-0 kubenswrapper[16352]: I0307 21:44:13.838606 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-log" Mar 07 21:44:13.838931 master-0 kubenswrapper[16352]: I0307 21:44:13.838911 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-metadata" Mar 07 21:44:13.839007 master-0 kubenswrapper[16352]: I0307 21:44:13.838992 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" containerName="nova-metadata-log" 
Mar 07 21:44:13.840728 master-0 kubenswrapper[16352]: I0307 21:44:13.840708 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:44:13.844892 master-0 kubenswrapper[16352]: I0307 21:44:13.844862 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 07 21:44:13.845117 master-0 kubenswrapper[16352]: I0307 21:44:13.845085 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 07 21:44:13.857931 master-0 kubenswrapper[16352]: I0307 21:44:13.857811 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:14.019454 master-0 kubenswrapper[16352]: I0307 21:44:14.019370 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-config-data\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.019454 master-0 kubenswrapper[16352]: I0307 21:44:14.019448 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.019784 master-0 kubenswrapper[16352]: I0307 21:44:14.019731 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.020131 master-0 kubenswrapper[16352]: I0307 21:44:14.020078 
16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzh2q\" (UniqueName: \"kubernetes.io/projected/700e9591-0d76-4e95-bf4b-9e15d6164fdc-kube-api-access-rzh2q\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.020198 master-0 kubenswrapper[16352]: I0307 21:44:14.020148 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/700e9591-0d76-4e95-bf4b-9e15d6164fdc-logs\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.125906 master-0 kubenswrapper[16352]: I0307 21:44:14.125800 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.125906 master-0 kubenswrapper[16352]: I0307 21:44:14.125924 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.126305 master-0 kubenswrapper[16352]: I0307 21:44:14.126037 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzh2q\" (UniqueName: \"kubernetes.io/projected/700e9591-0d76-4e95-bf4b-9e15d6164fdc-kube-api-access-rzh2q\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.126305 master-0 kubenswrapper[16352]: I0307 21:44:14.126059 16352 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/700e9591-0d76-4e95-bf4b-9e15d6164fdc-logs\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.126305 master-0 kubenswrapper[16352]: I0307 21:44:14.126234 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-config-data\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.127438 master-0 kubenswrapper[16352]: I0307 21:44:14.127363 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/700e9591-0d76-4e95-bf4b-9e15d6164fdc-logs\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.130577 master-0 kubenswrapper[16352]: I0307 21:44:14.130390 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.131015 master-0 kubenswrapper[16352]: I0307 21:44:14.130958 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.131095 master-0 kubenswrapper[16352]: I0307 21:44:14.131022 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-config-data\") pod 
\"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.144867 master-0 kubenswrapper[16352]: I0307 21:44:14.144813 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzh2q\" (UniqueName: \"kubernetes.io/projected/700e9591-0d76-4e95-bf4b-9e15d6164fdc-kube-api-access-rzh2q\") pod \"nova-metadata-0\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") " pod="openstack/nova-metadata-0" Mar 07 21:44:14.176607 master-0 kubenswrapper[16352]: I0307 21:44:14.176507 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:44:14.658959 master-0 kubenswrapper[16352]: I0307 21:44:14.658793 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 07 21:44:14.659587 master-0 kubenswrapper[16352]: I0307 21:44:14.659326 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ironic-conductor-0" Mar 07 21:44:14.710480 master-0 kubenswrapper[16352]: I0307 21:44:14.709744 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ironic-conductor-0" Mar 07 21:44:14.742153 master-0 kubenswrapper[16352]: I0307 21:44:14.741781 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:15.215977 master-0 kubenswrapper[16352]: I0307 21:44:15.215894 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd90cad-b881-461f-8fa9-6e1d5925f6f6" path="/var/lib/kubelet/pods/afd90cad-b881-461f-8fa9-6e1d5925f6f6/volumes" Mar 07 21:44:15.690500 master-0 kubenswrapper[16352]: I0307 21:44:15.690420 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"700e9591-0d76-4e95-bf4b-9e15d6164fdc","Type":"ContainerStarted","Data":"4783df70101e45f229bd0449d2b64be860820a6dd390e90547b6dd4bf0703b9a"} Mar 07 21:44:15.690500 master-0 
kubenswrapper[16352]: I0307 21:44:15.690490 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"700e9591-0d76-4e95-bf4b-9e15d6164fdc","Type":"ContainerStarted","Data":"c4072772c678119f81ba88488649a93dc34873a721c20e5d8d5da5bace64ba6b"} Mar 07 21:44:15.690500 master-0 kubenswrapper[16352]: I0307 21:44:15.690503 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"700e9591-0d76-4e95-bf4b-9e15d6164fdc","Type":"ContainerStarted","Data":"a9a7d24e6824ffbe6b7e8576eb9b7302600ebdb9e2397cbe9c47777b939285ba"} Mar 07 21:44:15.733199 master-0 kubenswrapper[16352]: I0307 21:44:15.733085 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.733057715 podStartE2EDuration="2.733057715s" podCreationTimestamp="2026-03-07 21:44:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:15.719586482 +0000 UTC m=+1578.790291571" watchObservedRunningTime="2026-03-07 21:44:15.733057715 +0000 UTC m=+1578.803762774" Mar 07 21:44:16.176475 master-0 kubenswrapper[16352]: I0307 21:44:16.176389 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ironic-conductor-0" Mar 07 21:44:16.705034 master-0 kubenswrapper[16352]: I0307 21:44:16.704607 16352 generic.go:334] "Generic (PLEG): container finished" podID="bd475e91-463a-4538-84af-ba2b678d7f06" containerID="43d9dc91614e24b8ae37ce9d5e437173479c23869edf0df37c3e4904686c5776" exitCode=0 Mar 07 21:44:16.705034 master-0 kubenswrapper[16352]: I0307 21:44:16.705031 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c5sqt" event={"ID":"bd475e91-463a-4538-84af-ba2b678d7f06","Type":"ContainerDied","Data":"43d9dc91614e24b8ae37ce9d5e437173479c23869edf0df37c3e4904686c5776"} Mar 07 21:44:16.762211 master-0 
kubenswrapper[16352]: I0307 21:44:16.762125 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0"
Mar 07 21:44:17.216314 master-0 kubenswrapper[16352]: I0307 21:44:17.216233 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0"
Mar 07 21:44:17.216314 master-0 kubenswrapper[16352]: I0307 21:44:17.216324 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0"
Mar 07 21:44:17.252189 master-0 kubenswrapper[16352]: I0307 21:44:17.252100 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0"
Mar 07 21:44:17.337560 master-0 kubenswrapper[16352]: I0307 21:44:17.337491 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 07 21:44:17.337560 master-0 kubenswrapper[16352]: I0307 21:44:17.337564 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0"
Mar 07 21:44:17.361142 master-0 kubenswrapper[16352]: I0307 21:44:17.361064 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0"
Mar 07 21:44:17.492784 master-0 kubenswrapper[16352]: I0307 21:44:17.492613 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:44:17.627063 master-0 kubenswrapper[16352]: I0307 21:44:17.626298 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7754f44b87-jrdnd"]
Mar 07 21:44:17.627063 master-0 kubenswrapper[16352]: I0307 21:44:17.626596 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerName="dnsmasq-dns" containerID="cri-o://1a77f04dda3772bff59116945cc0f533e1a0ed0929704e64ead5e1e2c1ad0583" gracePeriod=10
Mar 07 21:44:17.735805 master-0 kubenswrapper[16352]: I0307 21:44:17.735382 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ironic-conductor-0"
Mar 07 21:44:17.827715 master-0 kubenswrapper[16352]: I0307 21:44:17.827596 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0"
Mar 07 21:44:18.380028 master-0 kubenswrapper[16352]: I0307 21:44:18.379942 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.2:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:44:18.423651 master-0 kubenswrapper[16352]: I0307 21:44:18.423542 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.2:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)"
Mar 07 21:44:19.184600 master-0 kubenswrapper[16352]: I0307 21:44:19.176690 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 07 21:44:19.184600 master-0 kubenswrapper[16352]: I0307 21:44:19.176813 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0"
Mar 07 21:44:19.722245 master-0 kubenswrapper[16352]: I0307 21:44:19.722039 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.128.0.241:5353: connect: connection refused"
Mar 07 21:44:22.836307 master-0 kubenswrapper[16352]: I0307 21:44:22.836205 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-c5sqt" event={"ID":"bd475e91-463a-4538-84af-ba2b678d7f06","Type":"ContainerDied","Data":"ae3119a195741ab2c81cdcb91396dc058d9ff9ce6cd2a4644470c3587c51748a"}
Mar 07 21:44:22.836307 master-0 kubenswrapper[16352]: I0307 21:44:22.836303 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae3119a195741ab2c81cdcb91396dc058d9ff9ce6cd2a4644470c3587c51748a"
Mar 07 21:44:22.840599 master-0 kubenswrapper[16352]: I0307 21:44:22.840508 16352 generic.go:334] "Generic (PLEG): container finished" podID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerID="1a77f04dda3772bff59116945cc0f533e1a0ed0929704e64ead5e1e2c1ad0583" exitCode=0
Mar 07 21:44:22.840757 master-0 kubenswrapper[16352]: I0307 21:44:22.840624 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" event={"ID":"27e6c72c-28fc-4783-a670-31fe4f9b98fe","Type":"ContainerDied","Data":"1a77f04dda3772bff59116945cc0f533e1a0ed0929704e64ead5e1e2c1ad0583"}
Mar 07 21:44:22.859092 master-0 kubenswrapper[16352]: I0307 21:44:22.859014 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c5sqt"
Mar 07 21:44:22.983965 master-0 kubenswrapper[16352]: I0307 21:44:22.983879 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-config-data\") pod \"bd475e91-463a-4538-84af-ba2b678d7f06\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") "
Mar 07 21:44:22.984144 master-0 kubenswrapper[16352]: I0307 21:44:22.984055 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-combined-ca-bundle\") pod \"bd475e91-463a-4538-84af-ba2b678d7f06\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") "
Mar 07 21:44:22.984144 master-0 kubenswrapper[16352]: I0307 21:44:22.984104 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-scripts\") pod \"bd475e91-463a-4538-84af-ba2b678d7f06\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") "
Mar 07 21:44:22.985962 master-0 kubenswrapper[16352]: I0307 21:44:22.984269 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wcvgg\" (UniqueName: \"kubernetes.io/projected/bd475e91-463a-4538-84af-ba2b678d7f06-kube-api-access-wcvgg\") pod \"bd475e91-463a-4538-84af-ba2b678d7f06\" (UID: \"bd475e91-463a-4538-84af-ba2b678d7f06\") "
Mar 07 21:44:22.989133 master-0 kubenswrapper[16352]: I0307 21:44:22.989035 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd475e91-463a-4538-84af-ba2b678d7f06-kube-api-access-wcvgg" (OuterVolumeSpecName: "kube-api-access-wcvgg") pod "bd475e91-463a-4538-84af-ba2b678d7f06" (UID: "bd475e91-463a-4538-84af-ba2b678d7f06"). InnerVolumeSpecName "kube-api-access-wcvgg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:44:22.990040 master-0 kubenswrapper[16352]: I0307 21:44:22.989982 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-scripts" (OuterVolumeSpecName: "scripts") pod "bd475e91-463a-4538-84af-ba2b678d7f06" (UID: "bd475e91-463a-4538-84af-ba2b678d7f06"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:23.018327 master-0 kubenswrapper[16352]: I0307 21:44:23.018153 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-config-data" (OuterVolumeSpecName: "config-data") pod "bd475e91-463a-4538-84af-ba2b678d7f06" (UID: "bd475e91-463a-4538-84af-ba2b678d7f06"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:23.029053 master-0 kubenswrapper[16352]: I0307 21:44:23.028966 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd475e91-463a-4538-84af-ba2b678d7f06" (UID: "bd475e91-463a-4538-84af-ba2b678d7f06"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:23.090895 master-0 kubenswrapper[16352]: I0307 21:44:23.090824 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.090895 master-0 kubenswrapper[16352]: I0307 21:44:23.090882 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.090895 master-0 kubenswrapper[16352]: I0307 21:44:23.090896 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bd475e91-463a-4538-84af-ba2b678d7f06-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.090895 master-0 kubenswrapper[16352]: I0307 21:44:23.090909 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wcvgg\" (UniqueName: \"kubernetes.io/projected/bd475e91-463a-4538-84af-ba2b678d7f06-kube-api-access-wcvgg\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.292083 master-0 kubenswrapper[16352]: I0307 21:44:23.292016 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd"
Mar 07 21:44:23.411267 master-0 kubenswrapper[16352]: I0307 21:44:23.411193 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-config\") pod \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") "
Mar 07 21:44:23.411735 master-0 kubenswrapper[16352]: I0307 21:44:23.411672 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-swift-storage-0\") pod \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") "
Mar 07 21:44:23.418883 master-0 kubenswrapper[16352]: I0307 21:44:23.418836 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-svc\") pod \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") "
Mar 07 21:44:23.419142 master-0 kubenswrapper[16352]: I0307 21:44:23.419119 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-nb\") pod \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") "
Mar 07 21:44:23.419337 master-0 kubenswrapper[16352]: I0307 21:44:23.419320 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-sb\") pod \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") "
Mar 07 21:44:23.419512 master-0 kubenswrapper[16352]: I0307 21:44:23.419494 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dfs6m\" (UniqueName: \"kubernetes.io/projected/27e6c72c-28fc-4783-a670-31fe4f9b98fe-kube-api-access-dfs6m\") pod \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\" (UID: \"27e6c72c-28fc-4783-a670-31fe4f9b98fe\") "
Mar 07 21:44:23.429833 master-0 kubenswrapper[16352]: I0307 21:44:23.429248 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/27e6c72c-28fc-4783-a670-31fe4f9b98fe-kube-api-access-dfs6m" (OuterVolumeSpecName: "kube-api-access-dfs6m") pod "27e6c72c-28fc-4783-a670-31fe4f9b98fe" (UID: "27e6c72c-28fc-4783-a670-31fe4f9b98fe"). InnerVolumeSpecName "kube-api-access-dfs6m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:44:23.485848 master-0 kubenswrapper[16352]: I0307 21:44:23.485785 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "27e6c72c-28fc-4783-a670-31fe4f9b98fe" (UID: "27e6c72c-28fc-4783-a670-31fe4f9b98fe"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:44:23.491729 master-0 kubenswrapper[16352]: I0307 21:44:23.491613 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "27e6c72c-28fc-4783-a670-31fe4f9b98fe" (UID: "27e6c72c-28fc-4783-a670-31fe4f9b98fe"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:44:23.497776 master-0 kubenswrapper[16352]: I0307 21:44:23.496021 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-config" (OuterVolumeSpecName: "config") pod "27e6c72c-28fc-4783-a670-31fe4f9b98fe" (UID: "27e6c72c-28fc-4783-a670-31fe4f9b98fe"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:44:23.502643 master-0 kubenswrapper[16352]: I0307 21:44:23.502568 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "27e6c72c-28fc-4783-a670-31fe4f9b98fe" (UID: "27e6c72c-28fc-4783-a670-31fe4f9b98fe"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:44:23.512001 master-0 kubenswrapper[16352]: I0307 21:44:23.511945 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "27e6c72c-28fc-4783-a670-31fe4f9b98fe" (UID: "27e6c72c-28fc-4783-a670-31fe4f9b98fe"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:44:23.525242 master-0 kubenswrapper[16352]: I0307 21:44:23.525171 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.525242 master-0 kubenswrapper[16352]: I0307 21:44:23.525233 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.525502 master-0 kubenswrapper[16352]: I0307 21:44:23.525250 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.525502 master-0 kubenswrapper[16352]: I0307 21:44:23.525264 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dfs6m\" (UniqueName: \"kubernetes.io/projected/27e6c72c-28fc-4783-a670-31fe4f9b98fe-kube-api-access-dfs6m\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.525502 master-0 kubenswrapper[16352]: I0307 21:44:23.525275 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.525502 master-0 kubenswrapper[16352]: I0307 21:44:23.525285 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/27e6c72c-28fc-4783-a670-31fe4f9b98fe-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:23.863221 master-0 kubenswrapper[16352]: I0307 21:44:23.863124 16352 generic.go:334] "Generic (PLEG): container finished" podID="a726ded1-f768-48e1-87c1-1b99262d45e1" containerID="bc13e8beb36012f16bde41d8bcda1e30055c34fd50a42f60875a5404f6660ecf" exitCode=0
Mar 07 21:44:23.864178 master-0 kubenswrapper[16352]: I0307 21:44:23.863337 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2rz24" event={"ID":"a726ded1-f768-48e1-87c1-1b99262d45e1","Type":"ContainerDied","Data":"bc13e8beb36012f16bde41d8bcda1e30055c34fd50a42f60875a5404f6660ecf"}
Mar 07 21:44:23.868532 master-0 kubenswrapper[16352]: I0307 21:44:23.868451 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd"
Mar 07 21:44:23.868739 master-0 kubenswrapper[16352]: I0307 21:44:23.868449 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7754f44b87-jrdnd" event={"ID":"27e6c72c-28fc-4783-a670-31fe4f9b98fe","Type":"ContainerDied","Data":"1a11555c46d47147ea59dd5f39a02f1f954b6f92e0fccf4380cbe37bdd69e46e"}
Mar 07 21:44:23.868830 master-0 kubenswrapper[16352]: I0307 21:44:23.868740 16352 scope.go:117] "RemoveContainer" containerID="1a77f04dda3772bff59116945cc0f533e1a0ed0929704e64ead5e1e2c1ad0583"
Mar 07 21:44:23.871753 master-0 kubenswrapper[16352]: I0307 21:44:23.871627 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-compute-ironic-compute-0" event={"ID":"002ef748-e065-42d6-8fa2-11d819c5ce98","Type":"ContainerStarted","Data":"0b1349bb944895e697d494c7c15e7bc0d69e58814dd7169a391ec89761896cda"}
Mar 07 21:44:23.871928 master-0 kubenswrapper[16352]: I0307 21:44:23.871653 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-c5sqt"
Mar 07 21:44:23.921648 master-0 kubenswrapper[16352]: I0307 21:44:23.921575 16352 scope.go:117] "RemoveContainer" containerID="87863a99040258e3132a9d56d477a96b22234d6ff00c03d87ba512e02820601c"
Mar 07 21:44:23.967944 master-0 kubenswrapper[16352]: I0307 21:44:23.967742 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-compute-ironic-compute-0" podStartSLOduration=3.196396474 podStartE2EDuration="17.967714887s" podCreationTimestamp="2026-03-07 21:44:06 +0000 UTC" firstStartedPulling="2026-03-07 21:44:08.116716668 +0000 UTC m=+1571.187421727" lastFinishedPulling="2026-03-07 21:44:22.888035081 +0000 UTC m=+1585.958740140" observedRunningTime="2026-03-07 21:44:23.928534846 +0000 UTC m=+1586.999240015" watchObservedRunningTime="2026-03-07 21:44:23.967714887 +0000 UTC m=+1587.038419936"
Mar 07 21:44:23.986004 master-0 kubenswrapper[16352]: I0307 21:44:23.985925 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7754f44b87-jrdnd"]
Mar 07 21:44:23.998663 master-0 kubenswrapper[16352]: I0307 21:44:23.998576 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7754f44b87-jrdnd"]
Mar 07 21:44:24.106903 master-0 kubenswrapper[16352]: I0307 21:44:24.106827 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:44:24.107160 master-0 kubenswrapper[16352]: I0307 21:44:24.107108 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-log" containerID="cri-o://79049fce9fb2a5ced5424220a56dd87c4ecfa50bb4de1709f5a077e0bf3916c8" gracePeriod=30
Mar 07 21:44:24.107699 master-0 kubenswrapper[16352]: I0307 21:44:24.107652 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-api" containerID="cri-o://e2dedb2cb9585b01fbdec6d2c22662c5b76db2a1bf6d57f77c4d5b94eace2971" gracePeriod=30
Mar 07 21:44:24.126748 master-0 kubenswrapper[16352]: I0307 21:44:24.125877 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 21:44:24.126748 master-0 kubenswrapper[16352]: I0307 21:44:24.126160 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="e787124b-7250-4b3f-953e-b91655e82506" containerName="nova-scheduler-scheduler" containerID="cri-o://970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a" gracePeriod=30
Mar 07 21:44:24.136938 master-0 kubenswrapper[16352]: I0307 21:44:24.136875 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 07 21:44:24.137164 master-0 kubenswrapper[16352]: I0307 21:44:24.137107 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-log" containerID="cri-o://c4072772c678119f81ba88488649a93dc34873a721c20e5d8d5da5bace64ba6b" gracePeriod=30
Mar 07 21:44:24.137306 master-0 kubenswrapper[16352]: I0307 21:44:24.137269 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-metadata" containerID="cri-o://4783df70101e45f229bd0449d2b64be860820a6dd390e90547b6dd4bf0703b9a" gracePeriod=30
Mar 07 21:44:24.929946 master-0 kubenswrapper[16352]: I0307 21:44:24.923774 16352 generic.go:334] "Generic (PLEG): container finished" podID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerID="79049fce9fb2a5ced5424220a56dd87c4ecfa50bb4de1709f5a077e0bf3916c8" exitCode=143
Mar 07 21:44:24.929946 master-0 kubenswrapper[16352]: I0307 21:44:24.923880 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"baf221a2-4d50-4677-b110-466e8d64e3a8","Type":"ContainerDied","Data":"79049fce9fb2a5ced5424220a56dd87c4ecfa50bb4de1709f5a077e0bf3916c8"}
Mar 07 21:44:24.932486 master-0 kubenswrapper[16352]: I0307 21:44:24.932412 16352 generic.go:334] "Generic (PLEG): container finished" podID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerID="4783df70101e45f229bd0449d2b64be860820a6dd390e90547b6dd4bf0703b9a" exitCode=0
Mar 07 21:44:24.932486 master-0 kubenswrapper[16352]: I0307 21:44:24.932470 16352 generic.go:334] "Generic (PLEG): container finished" podID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerID="c4072772c678119f81ba88488649a93dc34873a721c20e5d8d5da5bace64ba6b" exitCode=143
Mar 07 21:44:24.932708 master-0 kubenswrapper[16352]: I0307 21:44:24.932477 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"700e9591-0d76-4e95-bf4b-9e15d6164fdc","Type":"ContainerDied","Data":"4783df70101e45f229bd0449d2b64be860820a6dd390e90547b6dd4bf0703b9a"}
Mar 07 21:44:24.932708 master-0 kubenswrapper[16352]: I0307 21:44:24.932530 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"700e9591-0d76-4e95-bf4b-9e15d6164fdc","Type":"ContainerDied","Data":"c4072772c678119f81ba88488649a93dc34873a721c20e5d8d5da5bace64ba6b"}
Mar 07 21:44:24.933179 master-0 kubenswrapper[16352]: I0307 21:44:24.933146 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 07 21:44:24.977010 master-0 kubenswrapper[16352]: I0307 21:44:24.976896 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-compute-ironic-compute-0"
Mar 07 21:44:25.257584 master-0 kubenswrapper[16352]: I0307 21:44:25.257429 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 07 21:44:25.303486 master-0 kubenswrapper[16352]: I0307 21:44:25.303405 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" path="/var/lib/kubelet/pods/27e6c72c-28fc-4783-a670-31fe4f9b98fe/volumes"
Mar 07 21:44:25.374629 master-0 kubenswrapper[16352]: I0307 21:44:25.374533 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-config-data\") pod \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") "
Mar 07 21:44:25.375041 master-0 kubenswrapper[16352]: I0307 21:44:25.374711 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-nova-metadata-tls-certs\") pod \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") "
Mar 07 21:44:25.375041 master-0 kubenswrapper[16352]: I0307 21:44:25.374808 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzh2q\" (UniqueName: \"kubernetes.io/projected/700e9591-0d76-4e95-bf4b-9e15d6164fdc-kube-api-access-rzh2q\") pod \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") "
Mar 07 21:44:25.375166 master-0 kubenswrapper[16352]: I0307 21:44:25.375131 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/700e9591-0d76-4e95-bf4b-9e15d6164fdc-logs\") pod \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") "
Mar 07 21:44:25.375458 master-0 kubenswrapper[16352]: I0307 21:44:25.375421 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-combined-ca-bundle\") pod \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\" (UID: \"700e9591-0d76-4e95-bf4b-9e15d6164fdc\") "
Mar 07 21:44:25.378193 master-0 kubenswrapper[16352]: I0307 21:44:25.378147 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/700e9591-0d76-4e95-bf4b-9e15d6164fdc-logs" (OuterVolumeSpecName: "logs") pod "700e9591-0d76-4e95-bf4b-9e15d6164fdc" (UID: "700e9591-0d76-4e95-bf4b-9e15d6164fdc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:44:25.394804 master-0 kubenswrapper[16352]: I0307 21:44:25.383077 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/700e9591-0d76-4e95-bf4b-9e15d6164fdc-kube-api-access-rzh2q" (OuterVolumeSpecName: "kube-api-access-rzh2q") pod "700e9591-0d76-4e95-bf4b-9e15d6164fdc" (UID: "700e9591-0d76-4e95-bf4b-9e15d6164fdc"). InnerVolumeSpecName "kube-api-access-rzh2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:44:25.431136 master-0 kubenswrapper[16352]: I0307 21:44:25.425890 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-config-data" (OuterVolumeSpecName: "config-data") pod "700e9591-0d76-4e95-bf4b-9e15d6164fdc" (UID: "700e9591-0d76-4e95-bf4b-9e15d6164fdc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:25.431136 master-0 kubenswrapper[16352]: I0307 21:44:25.427490 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "700e9591-0d76-4e95-bf4b-9e15d6164fdc" (UID: "700e9591-0d76-4e95-bf4b-9e15d6164fdc"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:25.479912 master-0 kubenswrapper[16352]: I0307 21:44:25.479798 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.479912 master-0 kubenswrapper[16352]: I0307 21:44:25.479902 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.479912 master-0 kubenswrapper[16352]: I0307 21:44:25.479918 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzh2q\" (UniqueName: \"kubernetes.io/projected/700e9591-0d76-4e95-bf4b-9e15d6164fdc-kube-api-access-rzh2q\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.480170 master-0 kubenswrapper[16352]: I0307 21:44:25.479929 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/700e9591-0d76-4e95-bf4b-9e15d6164fdc-logs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.517549 master-0 kubenswrapper[16352]: I0307 21:44:25.517348 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "700e9591-0d76-4e95-bf4b-9e15d6164fdc" (UID: "700e9591-0d76-4e95-bf4b-9e15d6164fdc"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:25.578883 master-0 kubenswrapper[16352]: I0307 21:44:25.578360 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:25.583547 master-0 kubenswrapper[16352]: I0307 21:44:25.583404 16352 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/700e9591-0d76-4e95-bf4b-9e15d6164fdc-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.687332 master-0 kubenswrapper[16352]: I0307 21:44:25.687236 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-46lxz\" (UniqueName: \"kubernetes.io/projected/a726ded1-f768-48e1-87c1-1b99262d45e1-kube-api-access-46lxz\") pod \"a726ded1-f768-48e1-87c1-1b99262d45e1\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") "
Mar 07 21:44:25.687586 master-0 kubenswrapper[16352]: I0307 21:44:25.687485 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-scripts\") pod \"a726ded1-f768-48e1-87c1-1b99262d45e1\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") "
Mar 07 21:44:25.687730 master-0 kubenswrapper[16352]: I0307 21:44:25.687699 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-config-data\") pod \"a726ded1-f768-48e1-87c1-1b99262d45e1\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") "
Mar 07 21:44:25.687778 master-0 kubenswrapper[16352]: I0307 21:44:25.687737 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-combined-ca-bundle\") pod \"a726ded1-f768-48e1-87c1-1b99262d45e1\" (UID: \"a726ded1-f768-48e1-87c1-1b99262d45e1\") "
Mar 07 21:44:25.693106 master-0 kubenswrapper[16352]: I0307 21:44:25.693039 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a726ded1-f768-48e1-87c1-1b99262d45e1-kube-api-access-46lxz" (OuterVolumeSpecName: "kube-api-access-46lxz") pod "a726ded1-f768-48e1-87c1-1b99262d45e1" (UID: "a726ded1-f768-48e1-87c1-1b99262d45e1"). InnerVolumeSpecName "kube-api-access-46lxz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:44:25.693478 master-0 kubenswrapper[16352]: I0307 21:44:25.693393 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-scripts" (OuterVolumeSpecName: "scripts") pod "a726ded1-f768-48e1-87c1-1b99262d45e1" (UID: "a726ded1-f768-48e1-87c1-1b99262d45e1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:25.718700 master-0 kubenswrapper[16352]: I0307 21:44:25.718597 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-config-data" (OuterVolumeSpecName: "config-data") pod "a726ded1-f768-48e1-87c1-1b99262d45e1" (UID: "a726ded1-f768-48e1-87c1-1b99262d45e1"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:25.734606 master-0 kubenswrapper[16352]: I0307 21:44:25.734522 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a726ded1-f768-48e1-87c1-1b99262d45e1" (UID: "a726ded1-f768-48e1-87c1-1b99262d45e1"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:44:25.795051 master-0 kubenswrapper[16352]: I0307 21:44:25.794873 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-46lxz\" (UniqueName: \"kubernetes.io/projected/a726ded1-f768-48e1-87c1-1b99262d45e1-kube-api-access-46lxz\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.795051 master-0 kubenswrapper[16352]: I0307 21:44:25.794950 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.795051 master-0 kubenswrapper[16352]: I0307 21:44:25.794972 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.795051 master-0 kubenswrapper[16352]: I0307 21:44:25.794986 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a726ded1-f768-48e1-87c1-1b99262d45e1-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:44:25.970309 master-0 kubenswrapper[16352]: I0307 21:44:25.970228 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-2rz24"
Mar 07 21:44:25.971041 master-0 kubenswrapper[16352]: I0307 21:44:25.970208 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-2rz24" event={"ID":"a726ded1-f768-48e1-87c1-1b99262d45e1","Type":"ContainerDied","Data":"68db8146aafed22084c32abd309143a43cfdf9c551262a6fd97522091a99fc6c"}
Mar 07 21:44:25.971041 master-0 kubenswrapper[16352]: I0307 21:44:25.970470 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="68db8146aafed22084c32abd309143a43cfdf9c551262a6fd97522091a99fc6c"
Mar 07 21:44:25.981903 master-0 kubenswrapper[16352]: I0307 21:44:25.981779 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"700e9591-0d76-4e95-bf4b-9e15d6164fdc","Type":"ContainerDied","Data":"a9a7d24e6824ffbe6b7e8576eb9b7302600ebdb9e2397cbe9c47777b939285ba"}
Mar 07 21:44:25.982295 master-0 kubenswrapper[16352]: I0307 21:44:25.981893 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0"
Mar 07 21:44:25.982415 master-0 kubenswrapper[16352]: I0307 21:44:25.982299 16352 scope.go:117] "RemoveContainer" containerID="4783df70101e45f229bd0449d2b64be860820a6dd390e90547b6dd4bf0703b9a"
Mar 07 21:44:26.049145 master-0 kubenswrapper[16352]: I0307 21:44:26.049096 16352 scope.go:117] "RemoveContainer" containerID="c4072772c678119f81ba88488649a93dc34873a721c20e5d8d5da5bace64ba6b"
Mar 07 21:44:26.120875 master-0 kubenswrapper[16352]: I0307 21:44:26.120817 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 07 21:44:26.130585 master-0 kubenswrapper[16352]: I0307 21:44:26.130508 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: E0307 21:44:26.131204 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-metadata"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: I0307 21:44:26.131230 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-metadata"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: E0307 21:44:26.131258 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerName="init"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: I0307 21:44:26.131264 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerName="init"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: E0307 21:44:26.131274 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd475e91-463a-4538-84af-ba2b678d7f06" containerName="nova-manage"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: I0307 21:44:26.131300 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd475e91-463a-4538-84af-ba2b678d7f06" containerName="nova-manage"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: E0307 21:44:26.131335 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-log"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: I0307 21:44:26.131342 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-log"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: E0307 21:44:26.131377 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a726ded1-f768-48e1-87c1-1b99262d45e1" containerName="nova-cell1-conductor-db-sync"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: I0307 21:44:26.131383 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="a726ded1-f768-48e1-87c1-1b99262d45e1" containerName="nova-cell1-conductor-db-sync"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: E0307 21:44:26.131394 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerName="dnsmasq-dns"
Mar 07 21:44:26.131391 master-0 kubenswrapper[16352]: I0307 21:44:26.131401 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerName="dnsmasq-dns"
Mar 07 21:44:26.132057 master-0 kubenswrapper[16352]: I0307 21:44:26.131712 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="27e6c72c-28fc-4783-a670-31fe4f9b98fe" containerName="dnsmasq-dns"
Mar 07 21:44:26.133982 master-0 kubenswrapper[16352]: I0307 21:44:26.133942 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd475e91-463a-4538-84af-ba2b678d7f06" containerName="nova-manage"
Mar 07 21:44:26.134082 master-0 kubenswrapper[16352]: I0307 21:44:26.134008 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="a726ded1-f768-48e1-87c1-1b99262d45e1" containerName="nova-cell1-conductor-db-sync"
Mar 07 21:44:26.134082 master-0 kubenswrapper[16352]: I0307 21:44:26.134028 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-metadata"
Mar 07 21:44:26.134082 master-0 kubenswrapper[16352]: I0307 21:44:26.134045 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" containerName="nova-metadata-log"
Mar 07 21:44:26.135220 master-0 kubenswrapper[16352]: I0307 21:44:26.135195 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0"
Mar 07 21:44:26.139716 master-0 kubenswrapper[16352]: I0307 21:44:26.139646 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data"
Mar 07 21:44:26.156155 master-0 kubenswrapper[16352]: I0307 21:44:26.156082 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"]
Mar 07 21:44:26.171875 master-0 kubenswrapper[16352]: I0307 21:44:26.171777 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"]
Mar 07 21:44:26.187157 master-0 kubenswrapper[16352]: I0307 21:44:26.185921 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"]
Mar 07 21:44:26.189625 master-0 kubenswrapper[16352]: I0307 21:44:26.189577 16352 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:44:26.193458 master-0 kubenswrapper[16352]: I0307 21:44:26.193394 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 07 21:44:26.193523 master-0 kubenswrapper[16352]: I0307 21:44:26.193414 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 07 21:44:26.200799 master-0 kubenswrapper[16352]: I0307 21:44:26.200667 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:26.329924 master-0 kubenswrapper[16352]: I0307 21:44:26.329729 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l47wg\" (UniqueName: \"kubernetes.io/projected/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-kube-api-access-l47wg\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.330375 master-0 kubenswrapper[16352]: I0307 21:44:26.330356 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-logs\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.330963 master-0 kubenswrapper[16352]: I0307 21:44:26.330887 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.331048 master-0 kubenswrapper[16352]: I0307 21:44:26.331010 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tr4d7\" 
(UniqueName: \"kubernetes.io/projected/359a7ec3-73e6-4dae-97c8-bc21e81a5952-kube-api-access-tr4d7\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.331471 master-0 kubenswrapper[16352]: I0307 21:44:26.331339 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.331628 master-0 kubenswrapper[16352]: I0307 21:44:26.331598 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-config-data\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.331723 master-0 kubenswrapper[16352]: I0307 21:44:26.331676 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/359a7ec3-73e6-4dae-97c8-bc21e81a5952-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.333053 master-0 kubenswrapper[16352]: I0307 21:44:26.333030 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359a7ec3-73e6-4dae-97c8-bc21e81a5952-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.436386 master-0 kubenswrapper[16352]: I0307 21:44:26.436261 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-l47wg\" (UniqueName: \"kubernetes.io/projected/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-kube-api-access-l47wg\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.436931 master-0 kubenswrapper[16352]: I0307 21:44:26.436440 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-logs\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.436931 master-0 kubenswrapper[16352]: I0307 21:44:26.436538 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.436931 master-0 kubenswrapper[16352]: I0307 21:44:26.436567 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tr4d7\" (UniqueName: \"kubernetes.io/projected/359a7ec3-73e6-4dae-97c8-bc21e81a5952-kube-api-access-tr4d7\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.436931 master-0 kubenswrapper[16352]: I0307 21:44:26.436822 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.437287 master-0 kubenswrapper[16352]: I0307 21:44:26.437124 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-config-data\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.437287 master-0 kubenswrapper[16352]: I0307 21:44:26.437230 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/359a7ec3-73e6-4dae-97c8-bc21e81a5952-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.437287 master-0 kubenswrapper[16352]: I0307 21:44:26.437241 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-logs\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.438430 master-0 kubenswrapper[16352]: I0307 21:44:26.438370 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359a7ec3-73e6-4dae-97c8-bc21e81a5952-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.449820 master-0 kubenswrapper[16352]: I0307 21:44:26.442066 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.449820 master-0 kubenswrapper[16352]: I0307 21:44:26.442646 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-config-data\") pod \"nova-metadata-0\" (UID: 
\"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.449820 master-0 kubenswrapper[16352]: I0307 21:44:26.442737 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/359a7ec3-73e6-4dae-97c8-bc21e81a5952-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.449820 master-0 kubenswrapper[16352]: I0307 21:44:26.442731 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.449820 master-0 kubenswrapper[16352]: I0307 21:44:26.444315 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/359a7ec3-73e6-4dae-97c8-bc21e81a5952-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.459976 master-0 kubenswrapper[16352]: I0307 21:44:26.459921 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l47wg\" (UniqueName: \"kubernetes.io/projected/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-kube-api-access-l47wg\") pod \"nova-metadata-0\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " pod="openstack/nova-metadata-0" Mar 07 21:44:26.467895 master-0 kubenswrapper[16352]: I0307 21:44:26.467854 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tr4d7\" (UniqueName: \"kubernetes.io/projected/359a7ec3-73e6-4dae-97c8-bc21e81a5952-kube-api-access-tr4d7\") pod \"nova-cell1-conductor-0\" (UID: \"359a7ec3-73e6-4dae-97c8-bc21e81a5952\") " pod="openstack/nova-cell1-conductor-0" Mar 07 
21:44:26.508380 master-0 kubenswrapper[16352]: I0307 21:44:26.508293 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:26.524424 master-0 kubenswrapper[16352]: I0307 21:44:26.524337 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:44:27.216033 master-0 kubenswrapper[16352]: I0307 21:44:27.215930 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="700e9591-0d76-4e95-bf4b-9e15d6164fdc" path="/var/lib/kubelet/pods/700e9591-0d76-4e95-bf4b-9e15d6164fdc/volumes" Mar 07 21:44:27.220063 master-0 kubenswrapper[16352]: E0307 21:44:27.219965 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 21:44:27.224765 master-0 kubenswrapper[16352]: E0307 21:44:27.223314 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 21:44:27.226513 master-0 kubenswrapper[16352]: E0307 21:44:27.226462 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 07 21:44:27.226600 master-0 kubenswrapper[16352]: E0307 21:44:27.226509 16352 prober.go:104] "Probe errored" err="rpc error: code = Unknown 
desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="e787124b-7250-4b3f-953e-b91655e82506" containerName="nova-scheduler-scheduler" Mar 07 21:44:27.284794 master-0 kubenswrapper[16352]: W0307 21:44:27.284703 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podae9bf672_4d9f_4146_92d2_1dbf658dfcbb.slice/crio-d3617f179844d894479ec8f8d4713749d5fd5b4f930d55c83cdd4ee0395145d1 WatchSource:0}: Error finding container d3617f179844d894479ec8f8d4713749d5fd5b4f930d55c83cdd4ee0395145d1: Status 404 returned error can't find the container with id d3617f179844d894479ec8f8d4713749d5fd5b4f930d55c83cdd4ee0395145d1 Mar 07 21:44:27.294481 master-0 kubenswrapper[16352]: I0307 21:44:27.294359 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:44:27.662149 master-0 kubenswrapper[16352]: I0307 21:44:27.662034 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 07 21:44:27.675976 master-0 kubenswrapper[16352]: W0307 21:44:27.675888 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod359a7ec3_73e6_4dae_97c8_bc21e81a5952.slice/crio-dc12eee96c4e8aef13a04734cd9daf06b0e92da47367f78358eeae8d875c02d0 WatchSource:0}: Error finding container dc12eee96c4e8aef13a04734cd9daf06b0e92da47367f78358eeae8d875c02d0: Status 404 returned error can't find the container with id dc12eee96c4e8aef13a04734cd9daf06b0e92da47367f78358eeae8d875c02d0 Mar 07 21:44:28.026200 master-0 kubenswrapper[16352]: I0307 21:44:28.026016 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" 
event={"ID":"359a7ec3-73e6-4dae-97c8-bc21e81a5952","Type":"ContainerStarted","Data":"5005df429232d9582be9ae886d350b30a643022f0e513afaf5401e7bb57899c9"} Mar 07 21:44:28.026200 master-0 kubenswrapper[16352]: I0307 21:44:28.026095 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"359a7ec3-73e6-4dae-97c8-bc21e81a5952","Type":"ContainerStarted","Data":"dc12eee96c4e8aef13a04734cd9daf06b0e92da47367f78358eeae8d875c02d0"} Mar 07 21:44:28.029887 master-0 kubenswrapper[16352]: I0307 21:44:28.029845 16352 generic.go:334] "Generic (PLEG): container finished" podID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerID="e2dedb2cb9585b01fbdec6d2c22662c5b76db2a1bf6d57f77c4d5b94eace2971" exitCode=0 Mar 07 21:44:28.029994 master-0 kubenswrapper[16352]: I0307 21:44:28.029913 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"baf221a2-4d50-4677-b110-466e8d64e3a8","Type":"ContainerDied","Data":"e2dedb2cb9585b01fbdec6d2c22662c5b76db2a1bf6d57f77c4d5b94eace2971"} Mar 07 21:44:28.037367 master-0 kubenswrapper[16352]: I0307 21:44:28.037263 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb","Type":"ContainerStarted","Data":"e1e87fade9c512cebf40ef76d354b9544a5dd6710efebf3df25c1d874089ecab"} Mar 07 21:44:28.037367 master-0 kubenswrapper[16352]: I0307 21:44:28.037379 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb","Type":"ContainerStarted","Data":"d3617f179844d894479ec8f8d4713749d5fd5b4f930d55c83cdd4ee0395145d1"} Mar 07 21:44:28.052303 master-0 kubenswrapper[16352]: I0307 21:44:28.052203 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.052158513 podStartE2EDuration="2.052158513s" podCreationTimestamp="2026-03-07 21:44:26 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:28.050598386 +0000 UTC m=+1591.121303455" watchObservedRunningTime="2026-03-07 21:44:28.052158513 +0000 UTC m=+1591.122863572" Mar 07 21:44:28.780402 master-0 kubenswrapper[16352]: I0307 21:44:28.780333 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:28.867138 master-0 kubenswrapper[16352]: I0307 21:44:28.867060 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baf221a2-4d50-4677-b110-466e8d64e3a8-logs\") pod \"baf221a2-4d50-4677-b110-466e8d64e3a8\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " Mar 07 21:44:28.867510 master-0 kubenswrapper[16352]: I0307 21:44:28.867361 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-config-data\") pod \"baf221a2-4d50-4677-b110-466e8d64e3a8\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " Mar 07 21:44:28.867510 master-0 kubenswrapper[16352]: I0307 21:44:28.867435 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-combined-ca-bundle\") pod \"baf221a2-4d50-4677-b110-466e8d64e3a8\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " Mar 07 21:44:28.867597 master-0 kubenswrapper[16352]: I0307 21:44:28.867541 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbfrw\" (UniqueName: \"kubernetes.io/projected/baf221a2-4d50-4677-b110-466e8d64e3a8-kube-api-access-zbfrw\") pod \"baf221a2-4d50-4677-b110-466e8d64e3a8\" (UID: \"baf221a2-4d50-4677-b110-466e8d64e3a8\") " Mar 07 21:44:28.868216 master-0 kubenswrapper[16352]: I0307 21:44:28.867751 16352 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baf221a2-4d50-4677-b110-466e8d64e3a8-logs" (OuterVolumeSpecName: "logs") pod "baf221a2-4d50-4677-b110-466e8d64e3a8" (UID: "baf221a2-4d50-4677-b110-466e8d64e3a8"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:44:28.868297 master-0 kubenswrapper[16352]: I0307 21:44:28.868239 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baf221a2-4d50-4677-b110-466e8d64e3a8-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:28.900130 master-0 kubenswrapper[16352]: I0307 21:44:28.900053 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "baf221a2-4d50-4677-b110-466e8d64e3a8" (UID: "baf221a2-4d50-4677-b110-466e8d64e3a8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:28.902554 master-0 kubenswrapper[16352]: I0307 21:44:28.900216 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baf221a2-4d50-4677-b110-466e8d64e3a8-kube-api-access-zbfrw" (OuterVolumeSpecName: "kube-api-access-zbfrw") pod "baf221a2-4d50-4677-b110-466e8d64e3a8" (UID: "baf221a2-4d50-4677-b110-466e8d64e3a8"). InnerVolumeSpecName "kube-api-access-zbfrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:44:28.904634 master-0 kubenswrapper[16352]: I0307 21:44:28.904587 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-config-data" (OuterVolumeSpecName: "config-data") pod "baf221a2-4d50-4677-b110-466e8d64e3a8" (UID: "baf221a2-4d50-4677-b110-466e8d64e3a8"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:28.961234 master-0 kubenswrapper[16352]: E0307 21:44:28.961045 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode787124b_7250_4b3f_953e_b91655e82506.slice/crio-conmon-970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode787124b_7250_4b3f_953e_b91655e82506.slice/crio-970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a.scope\": RecentStats: unable to find data in memory cache]" Mar 07 21:44:28.971258 master-0 kubenswrapper[16352]: I0307 21:44:28.971081 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:28.971258 master-0 kubenswrapper[16352]: I0307 21:44:28.971160 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/baf221a2-4d50-4677-b110-466e8d64e3a8-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:28.971258 master-0 kubenswrapper[16352]: I0307 21:44:28.971180 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbfrw\" (UniqueName: \"kubernetes.io/projected/baf221a2-4d50-4677-b110-466e8d64e3a8-kube-api-access-zbfrw\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:29.071996 master-0 kubenswrapper[16352]: I0307 21:44:29.071657 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb","Type":"ContainerStarted","Data":"6be1d1f530d0c7bdbc0d1c8090301072dfb8782fe72d092e1f23a3f4350c5f5c"} Mar 07 21:44:29.076506 master-0 kubenswrapper[16352]: I0307 21:44:29.076462 16352 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:44:29.076715 master-0 kubenswrapper[16352]: I0307 21:44:29.076565 16352 generic.go:334] "Generic (PLEG): container finished" podID="e787124b-7250-4b3f-953e-b91655e82506" containerID="970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a" exitCode=0 Mar 07 21:44:29.076775 master-0 kubenswrapper[16352]: I0307 21:44:29.076654 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e787124b-7250-4b3f-953e-b91655e82506","Type":"ContainerDied","Data":"970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a"} Mar 07 21:44:29.076880 master-0 kubenswrapper[16352]: I0307 21:44:29.076846 16352 scope.go:117] "RemoveContainer" containerID="970c2bfc1adb7fd9132ac97971378c59c36058365d8892ea356a014a12e7ca8a" Mar 07 21:44:29.080533 master-0 kubenswrapper[16352]: I0307 21:44:29.079989 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"baf221a2-4d50-4677-b110-466e8d64e3a8","Type":"ContainerDied","Data":"40b56398fa814da104fdf56b86b9603ba1174b63cb45154befeb070ac650c0b3"} Mar 07 21:44:29.080533 master-0 kubenswrapper[16352]: I0307 21:44:29.080044 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:29.080533 master-0 kubenswrapper[16352]: I0307 21:44:29.080091 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:29.125577 master-0 kubenswrapper[16352]: I0307 21:44:29.125299 16352 scope.go:117] "RemoveContainer" containerID="e2dedb2cb9585b01fbdec6d2c22662c5b76db2a1bf6d57f77c4d5b94eace2971" Mar 07 21:44:29.142305 master-0 kubenswrapper[16352]: I0307 21:44:29.142205 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=3.142179577 podStartE2EDuration="3.142179577s" podCreationTimestamp="2026-03-07 21:44:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:29.106782127 +0000 UTC m=+1592.177487186" watchObservedRunningTime="2026-03-07 21:44:29.142179577 +0000 UTC m=+1592.212884636" Mar 07 21:44:29.177000 master-0 kubenswrapper[16352]: I0307 21:44:29.176926 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jqp6w\" (UniqueName: \"kubernetes.io/projected/e787124b-7250-4b3f-953e-b91655e82506-kube-api-access-jqp6w\") pod \"e787124b-7250-4b3f-953e-b91655e82506\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " Mar 07 21:44:29.178122 master-0 kubenswrapper[16352]: I0307 21:44:29.178091 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-combined-ca-bundle\") pod \"e787124b-7250-4b3f-953e-b91655e82506\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " Mar 07 21:44:29.178198 master-0 kubenswrapper[16352]: I0307 21:44:29.178161 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-config-data\") pod \"e787124b-7250-4b3f-953e-b91655e82506\" (UID: \"e787124b-7250-4b3f-953e-b91655e82506\") " Mar 07 21:44:29.186505 master-0 kubenswrapper[16352]: I0307 21:44:29.186454 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e787124b-7250-4b3f-953e-b91655e82506-kube-api-access-jqp6w" (OuterVolumeSpecName: "kube-api-access-jqp6w") pod "e787124b-7250-4b3f-953e-b91655e82506" (UID: "e787124b-7250-4b3f-953e-b91655e82506"). InnerVolumeSpecName "kube-api-access-jqp6w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:44:29.186606 master-0 kubenswrapper[16352]: I0307 21:44:29.186522 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:29.196668 master-0 kubenswrapper[16352]: I0307 21:44:29.193296 16352 scope.go:117] "RemoveContainer" containerID="79049fce9fb2a5ced5424220a56dd87c4ecfa50bb4de1709f5a077e0bf3916c8" Mar 07 21:44:29.250032 master-0 kubenswrapper[16352]: I0307 21:44:29.249970 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e787124b-7250-4b3f-953e-b91655e82506" (UID: "e787124b-7250-4b3f-953e-b91655e82506"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:29.260575 master-0 kubenswrapper[16352]: I0307 21:44:29.260529 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-config-data" (OuterVolumeSpecName: "config-data") pod "e787124b-7250-4b3f-953e-b91655e82506" (UID: "e787124b-7250-4b3f-953e-b91655e82506"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:29.262520 master-0 kubenswrapper[16352]: I0307 21:44:29.262465 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:29.262646 master-0 kubenswrapper[16352]: I0307 21:44:29.262536 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:29.263418 master-0 kubenswrapper[16352]: E0307 21:44:29.263375 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-log" Mar 07 21:44:29.263418 master-0 kubenswrapper[16352]: I0307 21:44:29.263418 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-log" Mar 07 21:44:29.263653 master-0 kubenswrapper[16352]: E0307 21:44:29.263632 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e787124b-7250-4b3f-953e-b91655e82506" containerName="nova-scheduler-scheduler" Mar 07 21:44:29.263653 master-0 kubenswrapper[16352]: I0307 21:44:29.263649 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="e787124b-7250-4b3f-953e-b91655e82506" containerName="nova-scheduler-scheduler" Mar 07 21:44:29.263754 master-0 kubenswrapper[16352]: E0307 21:44:29.263706 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-api" Mar 07 21:44:29.263754 master-0 kubenswrapper[16352]: I0307 21:44:29.263715 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-api" Mar 07 21:44:29.264887 master-0 kubenswrapper[16352]: I0307 21:44:29.264746 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="e787124b-7250-4b3f-953e-b91655e82506" containerName="nova-scheduler-scheduler" Mar 07 21:44:29.264887 master-0 kubenswrapper[16352]: I0307 21:44:29.264797 16352 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-api" Mar 07 21:44:29.264887 master-0 kubenswrapper[16352]: I0307 21:44:29.264832 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" containerName="nova-api-log" Mar 07 21:44:29.267081 master-0 kubenswrapper[16352]: I0307 21:44:29.267052 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:29.267334 master-0 kubenswrapper[16352]: I0307 21:44:29.267267 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:29.272839 master-0 kubenswrapper[16352]: I0307 21:44:29.272767 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 07 21:44:29.283847 master-0 kubenswrapper[16352]: I0307 21:44:29.282668 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jqp6w\" (UniqueName: \"kubernetes.io/projected/e787124b-7250-4b3f-953e-b91655e82506-kube-api-access-jqp6w\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:29.283847 master-0 kubenswrapper[16352]: I0307 21:44:29.282721 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:29.283847 master-0 kubenswrapper[16352]: I0307 21:44:29.282735 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e787124b-7250-4b3f-953e-b91655e82506-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:29.385115 master-0 kubenswrapper[16352]: I0307 21:44:29.384957 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrqm6\" (UniqueName: \"kubernetes.io/projected/6a083cbc-d960-42c1-841f-8c0a8b262f87-kube-api-access-lrqm6\") 
pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.385115 master-0 kubenswrapper[16352]: I0307 21:44:29.385037 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-config-data\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.385891 master-0 kubenswrapper[16352]: I0307 21:44:29.385823 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.385941 master-0 kubenswrapper[16352]: I0307 21:44:29.385923 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a083cbc-d960-42c1-841f-8c0a8b262f87-logs\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.490004 master-0 kubenswrapper[16352]: I0307 21:44:29.489926 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.490004 master-0 kubenswrapper[16352]: I0307 21:44:29.490002 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a083cbc-d960-42c1-841f-8c0a8b262f87-logs\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.490289 master-0 
kubenswrapper[16352]: I0307 21:44:29.490059 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-config-data\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.490289 master-0 kubenswrapper[16352]: I0307 21:44:29.490078 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lrqm6\" (UniqueName: \"kubernetes.io/projected/6a083cbc-d960-42c1-841f-8c0a8b262f87-kube-api-access-lrqm6\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.491489 master-0 kubenswrapper[16352]: I0307 21:44:29.491450 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a083cbc-d960-42c1-841f-8c0a8b262f87-logs\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.502767 master-0 kubenswrapper[16352]: I0307 21:44:29.502706 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.504653 master-0 kubenswrapper[16352]: I0307 21:44:29.504602 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-config-data\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.511709 master-0 kubenswrapper[16352]: I0307 21:44:29.511645 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrqm6\" (UniqueName: 
\"kubernetes.io/projected/6a083cbc-d960-42c1-841f-8c0a8b262f87-kube-api-access-lrqm6\") pod \"nova-api-0\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " pod="openstack/nova-api-0" Mar 07 21:44:29.649934 master-0 kubenswrapper[16352]: I0307 21:44:29.649753 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:30.097787 master-0 kubenswrapper[16352]: I0307 21:44:30.097670 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e787124b-7250-4b3f-953e-b91655e82506","Type":"ContainerDied","Data":"b1a2fbfda0113819e21e5a5cc73858da71238f92e353d0f9236aa4666f748de8"} Mar 07 21:44:30.098426 master-0 kubenswrapper[16352]: I0307 21:44:30.098131 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:44:30.163837 master-0 kubenswrapper[16352]: I0307 21:44:30.160722 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:44:30.182704 master-0 kubenswrapper[16352]: I0307 21:44:30.182266 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:44:30.194717 master-0 kubenswrapper[16352]: I0307 21:44:30.193255 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:44:30.230816 master-0 kubenswrapper[16352]: I0307 21:44:30.230017 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:44:30.236727 master-0 kubenswrapper[16352]: I0307 21:44:30.232944 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 07 21:44:30.248698 master-0 kubenswrapper[16352]: I0307 21:44:30.244465 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:44:30.305284 master-0 kubenswrapper[16352]: I0307 21:44:30.305164 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:30.423956 master-0 kubenswrapper[16352]: I0307 21:44:30.423840 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ncb2\" (UniqueName: \"kubernetes.io/projected/acd4977f-05c9-482d-8c7b-4178e7ceb659-kube-api-access-4ncb2\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.428402 master-0 kubenswrapper[16352]: I0307 21:44:30.428321 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.428497 master-0 kubenswrapper[16352]: I0307 21:44:30.428484 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-config-data\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.532721 master-0 kubenswrapper[16352]: I0307 21:44:30.531317 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.532721 master-0 kubenswrapper[16352]: I0307 21:44:30.531412 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-config-data\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.532721 master-0 kubenswrapper[16352]: I0307 21:44:30.531704 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ncb2\" (UniqueName: \"kubernetes.io/projected/acd4977f-05c9-482d-8c7b-4178e7ceb659-kube-api-access-4ncb2\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.542714 master-0 kubenswrapper[16352]: I0307 21:44:30.537264 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.542714 master-0 kubenswrapper[16352]: I0307 21:44:30.538932 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-config-data\") pod \"nova-scheduler-0\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.558702 master-0 kubenswrapper[16352]: I0307 21:44:30.556916 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ncb2\" (UniqueName: \"kubernetes.io/projected/acd4977f-05c9-482d-8c7b-4178e7ceb659-kube-api-access-4ncb2\") pod \"nova-scheduler-0\" (UID: 
\"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " pod="openstack/nova-scheduler-0" Mar 07 21:44:30.570709 master-0 kubenswrapper[16352]: I0307 21:44:30.567760 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:44:31.122931 master-0 kubenswrapper[16352]: I0307 21:44:31.122355 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a083cbc-d960-42c1-841f-8c0a8b262f87","Type":"ContainerStarted","Data":"15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7"} Mar 07 21:44:31.122931 master-0 kubenswrapper[16352]: I0307 21:44:31.122425 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:44:31.122931 master-0 kubenswrapper[16352]: I0307 21:44:31.122447 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a083cbc-d960-42c1-841f-8c0a8b262f87","Type":"ContainerStarted","Data":"a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d"} Mar 07 21:44:31.122931 master-0 kubenswrapper[16352]: I0307 21:44:31.122457 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a083cbc-d960-42c1-841f-8c0a8b262f87","Type":"ContainerStarted","Data":"07114260e12516bce67b48f1c7a1928971eedd67163dbd4df2b2446b4b3e5982"} Mar 07 21:44:31.153954 master-0 kubenswrapper[16352]: I0307 21:44:31.153817 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.153789449 podStartE2EDuration="2.153789449s" podCreationTimestamp="2026-03-07 21:44:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:31.143839661 +0000 UTC m=+1594.214544760" watchObservedRunningTime="2026-03-07 21:44:31.153789449 +0000 UTC m=+1594.224494518" Mar 07 21:44:31.208557 master-0 kubenswrapper[16352]: I0307 
21:44:31.208441 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baf221a2-4d50-4677-b110-466e8d64e3a8" path="/var/lib/kubelet/pods/baf221a2-4d50-4677-b110-466e8d64e3a8/volumes" Mar 07 21:44:31.209531 master-0 kubenswrapper[16352]: I0307 21:44:31.209480 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e787124b-7250-4b3f-953e-b91655e82506" path="/var/lib/kubelet/pods/e787124b-7250-4b3f-953e-b91655e82506/volumes" Mar 07 21:44:31.525065 master-0 kubenswrapper[16352]: I0307 21:44:31.524815 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 21:44:31.525065 master-0 kubenswrapper[16352]: I0307 21:44:31.524971 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 21:44:32.148400 master-0 kubenswrapper[16352]: I0307 21:44:32.148270 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"acd4977f-05c9-482d-8c7b-4178e7ceb659","Type":"ContainerStarted","Data":"df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65"} Mar 07 21:44:32.148400 master-0 kubenswrapper[16352]: I0307 21:44:32.148374 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"acd4977f-05c9-482d-8c7b-4178e7ceb659","Type":"ContainerStarted","Data":"98448138001ef0184066282442e86a6627a0e452a5fae53594f1d4ebca89ad4c"} Mar 07 21:44:32.181822 master-0 kubenswrapper[16352]: I0307 21:44:32.181612 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.181580659 podStartE2EDuration="2.181580659s" podCreationTimestamp="2026-03-07 21:44:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:32.170904832 +0000 UTC m=+1595.241609931" watchObservedRunningTime="2026-03-07 
21:44:32.181580659 +0000 UTC m=+1595.252285758" Mar 07 21:44:35.569128 master-0 kubenswrapper[16352]: I0307 21:44:35.569012 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 07 21:44:36.526007 master-0 kubenswrapper[16352]: I0307 21:44:36.525166 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 21:44:36.526007 master-0 kubenswrapper[16352]: I0307 21:44:36.525255 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 21:44:36.559275 master-0 kubenswrapper[16352]: I0307 21:44:36.559124 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 07 21:44:37.535064 master-0 kubenswrapper[16352]: I0307 21:44:37.534960 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.8:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:44:37.544040 master-0 kubenswrapper[16352]: I0307 21:44:37.543941 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.8:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:44:39.651519 master-0 kubenswrapper[16352]: I0307 21:44:39.651369 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 21:44:39.651519 master-0 kubenswrapper[16352]: I0307 21:44:39.651487 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 21:44:40.568961 master-0 kubenswrapper[16352]: I0307 21:44:40.568890 
16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 07 21:44:40.616767 master-0 kubenswrapper[16352]: I0307 21:44:40.616450 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 07 21:44:40.732098 master-0 kubenswrapper[16352]: I0307 21:44:40.732003 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.128.1.9:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 07 21:44:40.732710 master-0 kubenswrapper[16352]: I0307 21:44:40.732016 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.128.1.9:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 07 21:44:41.429060 master-0 kubenswrapper[16352]: I0307 21:44:41.428989 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 07 21:44:43.086196 master-0 kubenswrapper[16352]: I0307 21:44:43.086117 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.282150 master-0 kubenswrapper[16352]: I0307 21:44:43.281593 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-combined-ca-bundle\") pod \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " Mar 07 21:44:43.282150 master-0 kubenswrapper[16352]: I0307 21:44:43.281785 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ftx7\" (UniqueName: \"kubernetes.io/projected/4fd23d6c-2193-4d30-90d9-c34092e4dc62-kube-api-access-4ftx7\") pod \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " Mar 07 21:44:43.282150 master-0 kubenswrapper[16352]: I0307 21:44:43.282112 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-config-data\") pod \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\" (UID: \"4fd23d6c-2193-4d30-90d9-c34092e4dc62\") " Mar 07 21:44:43.285758 master-0 kubenswrapper[16352]: I0307 21:44:43.285605 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4fd23d6c-2193-4d30-90d9-c34092e4dc62-kube-api-access-4ftx7" (OuterVolumeSpecName: "kube-api-access-4ftx7") pod "4fd23d6c-2193-4d30-90d9-c34092e4dc62" (UID: "4fd23d6c-2193-4d30-90d9-c34092e4dc62"). InnerVolumeSpecName "kube-api-access-4ftx7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:44:43.310968 master-0 kubenswrapper[16352]: I0307 21:44:43.310561 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4fd23d6c-2193-4d30-90d9-c34092e4dc62" (UID: "4fd23d6c-2193-4d30-90d9-c34092e4dc62"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:43.342327 master-0 kubenswrapper[16352]: I0307 21:44:43.342242 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-config-data" (OuterVolumeSpecName: "config-data") pod "4fd23d6c-2193-4d30-90d9-c34092e4dc62" (UID: "4fd23d6c-2193-4d30-90d9-c34092e4dc62"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:43.391147 master-0 kubenswrapper[16352]: I0307 21:44:43.390699 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:43.391147 master-0 kubenswrapper[16352]: I0307 21:44:43.390749 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ftx7\" (UniqueName: \"kubernetes.io/projected/4fd23d6c-2193-4d30-90d9-c34092e4dc62-kube-api-access-4ftx7\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:43.391147 master-0 kubenswrapper[16352]: I0307 21:44:43.390764 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4fd23d6c-2193-4d30-90d9-c34092e4dc62-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:43.406537 master-0 kubenswrapper[16352]: I0307 21:44:43.406457 16352 generic.go:334] "Generic (PLEG): container finished" 
podID="4fd23d6c-2193-4d30-90d9-c34092e4dc62" containerID="8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b" exitCode=137 Mar 07 21:44:43.406987 master-0 kubenswrapper[16352]: I0307 21:44:43.406563 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.407543 master-0 kubenswrapper[16352]: I0307 21:44:43.407333 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fd23d6c-2193-4d30-90d9-c34092e4dc62","Type":"ContainerDied","Data":"8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b"} Mar 07 21:44:43.408023 master-0 kubenswrapper[16352]: I0307 21:44:43.407905 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"4fd23d6c-2193-4d30-90d9-c34092e4dc62","Type":"ContainerDied","Data":"26cae7ecc153d8b4387ce4fad05918f0e85de6a1f4fb5372290728435e3b80d9"} Mar 07 21:44:43.408430 master-0 kubenswrapper[16352]: I0307 21:44:43.407981 16352 scope.go:117] "RemoveContainer" containerID="8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b" Mar 07 21:44:43.433022 master-0 kubenswrapper[16352]: I0307 21:44:43.432973 16352 scope.go:117] "RemoveContainer" containerID="8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b" Mar 07 21:44:43.433755 master-0 kubenswrapper[16352]: E0307 21:44:43.433701 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b\": container with ID starting with 8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b not found: ID does not exist" containerID="8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b" Mar 07 21:44:43.434026 master-0 kubenswrapper[16352]: I0307 21:44:43.433771 16352 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b"} err="failed to get container status \"8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b\": rpc error: code = NotFound desc = could not find container \"8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b\": container with ID starting with 8668e9e484e49a737fc49ecfb69860076bc3ae8e3af50b47fd98e0024baef76b not found: ID does not exist" Mar 07 21:44:43.479109 master-0 kubenswrapper[16352]: I0307 21:44:43.479030 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 21:44:43.495134 master-0 kubenswrapper[16352]: I0307 21:44:43.495024 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 21:44:43.529713 master-0 kubenswrapper[16352]: I0307 21:44:43.519281 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 21:44:43.529713 master-0 kubenswrapper[16352]: E0307 21:44:43.520806 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4fd23d6c-2193-4d30-90d9-c34092e4dc62" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 21:44:43.529713 master-0 kubenswrapper[16352]: I0307 21:44:43.520838 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="4fd23d6c-2193-4d30-90d9-c34092e4dc62" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 21:44:43.529713 master-0 kubenswrapper[16352]: I0307 21:44:43.521309 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="4fd23d6c-2193-4d30-90d9-c34092e4dc62" containerName="nova-cell1-novncproxy-novncproxy" Mar 07 21:44:43.529713 master-0 kubenswrapper[16352]: I0307 21:44:43.528929 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.532298 master-0 kubenswrapper[16352]: I0307 21:44:43.532168 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 07 21:44:43.532429 master-0 kubenswrapper[16352]: I0307 21:44:43.532405 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 07 21:44:43.532589 master-0 kubenswrapper[16352]: I0307 21:44:43.532559 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 07 21:44:43.557825 master-0 kubenswrapper[16352]: I0307 21:44:43.557758 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 21:44:43.701746 master-0 kubenswrapper[16352]: I0307 21:44:43.701618 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnd5b\" (UniqueName: \"kubernetes.io/projected/f9591dba-9e8c-49e4-8cf0-053534a98c2d-kube-api-access-tnd5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.702017 master-0 kubenswrapper[16352]: I0307 21:44:43.701784 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.702017 master-0 kubenswrapper[16352]: I0307 21:44:43.701859 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-nova-novncproxy-tls-certs\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.702017 master-0 kubenswrapper[16352]: I0307 21:44:43.702004 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.702396 master-0 kubenswrapper[16352]: I0307 21:44:43.702328 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.805730 master-0 kubenswrapper[16352]: I0307 21:44:43.805462 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.805730 master-0 kubenswrapper[16352]: I0307 21:44:43.805612 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.806249 master-0 kubenswrapper[16352]: I0307 21:44:43.805910 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnd5b\" (UniqueName: 
\"kubernetes.io/projected/f9591dba-9e8c-49e4-8cf0-053534a98c2d-kube-api-access-tnd5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.806249 master-0 kubenswrapper[16352]: I0307 21:44:43.805973 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.806249 master-0 kubenswrapper[16352]: I0307 21:44:43.806020 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.816844 master-0 kubenswrapper[16352]: I0307 21:44:43.816754 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.817593 master-0 kubenswrapper[16352]: I0307 21:44:43.817247 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.834717 master-0 kubenswrapper[16352]: I0307 21:44:43.827653 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.844868 master-0 kubenswrapper[16352]: I0307 21:44:43.844810 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/f9591dba-9e8c-49e4-8cf0-053534a98c2d-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.849788 master-0 kubenswrapper[16352]: I0307 21:44:43.849731 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnd5b\" (UniqueName: \"kubernetes.io/projected/f9591dba-9e8c-49e4-8cf0-053534a98c2d-kube-api-access-tnd5b\") pod \"nova-cell1-novncproxy-0\" (UID: \"f9591dba-9e8c-49e4-8cf0-053534a98c2d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:43.870262 master-0 kubenswrapper[16352]: I0307 21:44:43.870196 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:44.462576 master-0 kubenswrapper[16352]: W0307 21:44:44.462461 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9591dba_9e8c_49e4_8cf0_053534a98c2d.slice/crio-cf11b0d5b837e35d7ac6a797e328c395ba21cf8c06208f72abfdc591c0343241 WatchSource:0}: Error finding container cf11b0d5b837e35d7ac6a797e328c395ba21cf8c06208f72abfdc591c0343241: Status 404 returned error can't find the container with id cf11b0d5b837e35d7ac6a797e328c395ba21cf8c06208f72abfdc591c0343241 Mar 07 21:44:44.464838 master-0 kubenswrapper[16352]: I0307 21:44:44.464657 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 07 21:44:45.207385 master-0 kubenswrapper[16352]: I0307 21:44:45.207310 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4fd23d6c-2193-4d30-90d9-c34092e4dc62" path="/var/lib/kubelet/pods/4fd23d6c-2193-4d30-90d9-c34092e4dc62/volumes" Mar 07 21:44:45.464410 master-0 kubenswrapper[16352]: I0307 21:44:45.464252 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9591dba-9e8c-49e4-8cf0-053534a98c2d","Type":"ContainerStarted","Data":"5dfa11f48ddd92468d5ec9bb32d0f43b34781615b537fa23fb1cbed9b58b3428"} Mar 07 21:44:45.464410 master-0 kubenswrapper[16352]: I0307 21:44:45.464317 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"f9591dba-9e8c-49e4-8cf0-053534a98c2d","Type":"ContainerStarted","Data":"cf11b0d5b837e35d7ac6a797e328c395ba21cf8c06208f72abfdc591c0343241"} Mar 07 21:44:45.494818 master-0 kubenswrapper[16352]: I0307 21:44:45.494649 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.494609799 podStartE2EDuration="2.494609799s" podCreationTimestamp="2026-03-07 
21:44:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:45.484179779 +0000 UTC m=+1608.554884878" watchObservedRunningTime="2026-03-07 21:44:45.494609799 +0000 UTC m=+1608.565314898" Mar 07 21:44:46.536885 master-0 kubenswrapper[16352]: I0307 21:44:46.536785 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 21:44:46.537642 master-0 kubenswrapper[16352]: I0307 21:44:46.536937 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 21:44:46.543963 master-0 kubenswrapper[16352]: I0307 21:44:46.543816 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 21:44:46.545332 master-0 kubenswrapper[16352]: I0307 21:44:46.545274 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 21:44:48.871192 master-0 kubenswrapper[16352]: I0307 21:44:48.871052 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:49.654420 master-0 kubenswrapper[16352]: I0307 21:44:49.654329 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 21:44:49.655025 master-0 kubenswrapper[16352]: I0307 21:44:49.654966 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 21:44:49.655205 master-0 kubenswrapper[16352]: I0307 21:44:49.655154 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 21:44:49.657241 master-0 kubenswrapper[16352]: I0307 21:44:49.657202 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 21:44:50.549991 master-0 kubenswrapper[16352]: I0307 21:44:50.549877 16352 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 21:44:50.555123 master-0 kubenswrapper[16352]: I0307 21:44:50.555051 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 21:44:50.833319 master-0 kubenswrapper[16352]: I0307 21:44:50.833220 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5cc8bb4897-sws9x"] Mar 07 21:44:50.842935 master-0 kubenswrapper[16352]: I0307 21:44:50.842716 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:50.853041 master-0 kubenswrapper[16352]: I0307 21:44:50.852970 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc8bb4897-sws9x"] Mar 07 21:44:50.977094 master-0 kubenswrapper[16352]: I0307 21:44:50.976319 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-config\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:50.977094 master-0 kubenswrapper[16352]: I0307 21:44:50.976434 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:50.977094 master-0 kubenswrapper[16352]: I0307 21:44:50.976468 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b97bd\" (UniqueName: \"kubernetes.io/projected/eda5e03f-7fc8-4738-80b7-07a1569df13a-kube-api-access-b97bd\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: 
\"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:50.977094 master-0 kubenswrapper[16352]: I0307 21:44:50.976521 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:50.977094 master-0 kubenswrapper[16352]: I0307 21:44:50.976580 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-dns-svc\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:50.977094 master-0 kubenswrapper[16352]: I0307 21:44:50.976740 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.086508 master-0 kubenswrapper[16352]: I0307 21:44:51.086269 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-config\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.086508 master-0 kubenswrapper[16352]: I0307 21:44:51.086418 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.086508 master-0 kubenswrapper[16352]: I0307 21:44:51.086446 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b97bd\" (UniqueName: \"kubernetes.io/projected/eda5e03f-7fc8-4738-80b7-07a1569df13a-kube-api-access-b97bd\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.086508 master-0 kubenswrapper[16352]: I0307 21:44:51.086490 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.087564 master-0 kubenswrapper[16352]: I0307 21:44:51.087522 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-ovsdbserver-sb\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.087662 master-0 kubenswrapper[16352]: I0307 21:44:51.087629 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-dns-svc\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.090792 master-0 kubenswrapper[16352]: I0307 21:44:51.090725 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-ovsdbserver-nb\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.091344 master-0 kubenswrapper[16352]: I0307 21:44:51.091310 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-config\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.104709 master-0 kubenswrapper[16352]: I0307 21:44:51.104601 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.105088 master-0 kubenswrapper[16352]: I0307 21:44:51.104769 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-dns-svc\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.105721 master-0 kubenswrapper[16352]: I0307 21:44:51.105696 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/eda5e03f-7fc8-4738-80b7-07a1569df13a-dns-swift-storage-0\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.135193 master-0 kubenswrapper[16352]: I0307 21:44:51.135137 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b97bd\" (UniqueName: 
\"kubernetes.io/projected/eda5e03f-7fc8-4738-80b7-07a1569df13a-kube-api-access-b97bd\") pod \"dnsmasq-dns-5cc8bb4897-sws9x\" (UID: \"eda5e03f-7fc8-4738-80b7-07a1569df13a\") " pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.206048 master-0 kubenswrapper[16352]: I0307 21:44:51.205990 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:51.733786 master-0 kubenswrapper[16352]: I0307 21:44:51.733738 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5cc8bb4897-sws9x"] Mar 07 21:44:52.607986 master-0 kubenswrapper[16352]: I0307 21:44:52.607920 16352 generic.go:334] "Generic (PLEG): container finished" podID="eda5e03f-7fc8-4738-80b7-07a1569df13a" containerID="2fdc0f65471b4bcda5b6c76c783ec618a1bb13d1b4a729d0296c79cf0262b454" exitCode=0 Mar 07 21:44:52.608324 master-0 kubenswrapper[16352]: I0307 21:44:52.608013 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" event={"ID":"eda5e03f-7fc8-4738-80b7-07a1569df13a","Type":"ContainerDied","Data":"2fdc0f65471b4bcda5b6c76c783ec618a1bb13d1b4a729d0296c79cf0262b454"} Mar 07 21:44:52.608386 master-0 kubenswrapper[16352]: I0307 21:44:52.608339 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" event={"ID":"eda5e03f-7fc8-4738-80b7-07a1569df13a","Type":"ContainerStarted","Data":"7d18c37eecdba02f518ac39ef41fa114a2b2c6ef99bf45d3d412745a1c10d39a"} Mar 07 21:44:53.366121 master-0 kubenswrapper[16352]: I0307 21:44:53.366038 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:53.628367 master-0 kubenswrapper[16352]: I0307 21:44:53.628173 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" 
event={"ID":"eda5e03f-7fc8-4738-80b7-07a1569df13a","Type":"ContainerStarted","Data":"270bf301315a2aa6f6625e3d87396be87187e1826a378814a0421c179898538c"} Mar 07 21:44:53.628367 master-0 kubenswrapper[16352]: I0307 21:44:53.628296 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-log" containerID="cri-o://a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d" gracePeriod=30 Mar 07 21:44:53.628666 master-0 kubenswrapper[16352]: I0307 21:44:53.628406 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-api" containerID="cri-o://15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7" gracePeriod=30 Mar 07 21:44:53.629183 master-0 kubenswrapper[16352]: I0307 21:44:53.628870 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:44:53.657258 master-0 kubenswrapper[16352]: I0307 21:44:53.657153 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" podStartSLOduration=3.657128158 podStartE2EDuration="3.657128158s" podCreationTimestamp="2026-03-07 21:44:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:53.651278708 +0000 UTC m=+1616.721983777" watchObservedRunningTime="2026-03-07 21:44:53.657128158 +0000 UTC m=+1616.727833227" Mar 07 21:44:53.871131 master-0 kubenswrapper[16352]: I0307 21:44:53.871049 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:53.891488 master-0 kubenswrapper[16352]: I0307 21:44:53.891312 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:54.648763 master-0 kubenswrapper[16352]: I0307 21:44:54.648664 16352 generic.go:334] "Generic (PLEG): container finished" podID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerID="a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d" exitCode=143 Mar 07 21:44:54.648763 master-0 kubenswrapper[16352]: I0307 21:44:54.648733 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a083cbc-d960-42c1-841f-8c0a8b262f87","Type":"ContainerDied","Data":"a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d"} Mar 07 21:44:54.665448 master-0 kubenswrapper[16352]: I0307 21:44:54.665364 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 07 21:44:54.912033 master-0 kubenswrapper[16352]: I0307 21:44:54.911883 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-8cwkr"] Mar 07 21:44:54.914267 master-0 kubenswrapper[16352]: I0307 21:44:54.914226 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:54.917320 master-0 kubenswrapper[16352]: I0307 21:44:54.917285 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 07 21:44:54.917670 master-0 kubenswrapper[16352]: I0307 21:44:54.917605 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 07 21:44:54.939986 master-0 kubenswrapper[16352]: I0307 21:44:54.939925 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-host-discover-8g65x"] Mar 07 21:44:54.942475 master-0 kubenswrapper[16352]: I0307 21:44:54.942441 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:54.957758 master-0 kubenswrapper[16352]: I0307 21:44:54.957664 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8cwkr"] Mar 07 21:44:54.992107 master-0 kubenswrapper[16352]: I0307 21:44:54.992035 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-8g65x"] Mar 07 21:44:55.039876 master-0 kubenswrapper[16352]: I0307 21:44:55.039806 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-scripts\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.040147 master-0 kubenswrapper[16352]: I0307 21:44:55.039981 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-config-data\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.040147 master-0 kubenswrapper[16352]: I0307 21:44:55.040028 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-scripts\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.040147 master-0 kubenswrapper[16352]: I0307 21:44:55.040125 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-48cck\" (UniqueName: \"kubernetes.io/projected/2efb703b-1fcb-4aef-b969-8b4afd7dc207-kube-api-access-48cck\") pod 
\"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.040288 master-0 kubenswrapper[16352]: I0307 21:44:55.040155 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-combined-ca-bundle\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.040288 master-0 kubenswrapper[16352]: I0307 21:44:55.040207 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-config-data\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.040351 master-0 kubenswrapper[16352]: I0307 21:44:55.040325 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5qxqz\" (UniqueName: \"kubernetes.io/projected/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-kube-api-access-5qxqz\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.040385 master-0 kubenswrapper[16352]: I0307 21:44:55.040360 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.144910 master-0 kubenswrapper[16352]: I0307 21:44:55.144790 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-scripts\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.145212 master-0 kubenswrapper[16352]: I0307 21:44:55.144995 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-config-data\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.145212 master-0 kubenswrapper[16352]: I0307 21:44:55.145073 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-scripts\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.145359 master-0 kubenswrapper[16352]: I0307 21:44:55.145269 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-48cck\" (UniqueName: \"kubernetes.io/projected/2efb703b-1fcb-4aef-b969-8b4afd7dc207-kube-api-access-48cck\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.145359 master-0 kubenswrapper[16352]: I0307 21:44:55.145331 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-combined-ca-bundle\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.145499 master-0 kubenswrapper[16352]: I0307 21:44:55.145411 16352 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-config-data\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.145702 master-0 kubenswrapper[16352]: I0307 21:44:55.145624 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5qxqz\" (UniqueName: \"kubernetes.io/projected/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-kube-api-access-5qxqz\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.145811 master-0 kubenswrapper[16352]: I0307 21:44:55.145712 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.149961 master-0 kubenswrapper[16352]: I0307 21:44:55.149910 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-scripts\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.150370 master-0 kubenswrapper[16352]: I0307 21:44:55.150304 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-scripts\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.151341 master-0 kubenswrapper[16352]: I0307 21:44:55.151286 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-config-data\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.151936 master-0 kubenswrapper[16352]: I0307 21:44:55.151836 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-config-data\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.152029 master-0 kubenswrapper[16352]: I0307 21:44:55.151913 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-combined-ca-bundle\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.152504 master-0 kubenswrapper[16352]: I0307 21:44:55.152437 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.179776 master-0 kubenswrapper[16352]: I0307 21:44:55.179566 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-48cck\" (UniqueName: \"kubernetes.io/projected/2efb703b-1fcb-4aef-b969-8b4afd7dc207-kube-api-access-48cck\") pod \"nova-cell1-cell-mapping-8cwkr\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") " pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.180784 master-0 kubenswrapper[16352]: I0307 21:44:55.180719 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-5qxqz\" (UniqueName: \"kubernetes.io/projected/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-kube-api-access-5qxqz\") pod \"nova-cell1-host-discover-8g65x\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.240238 master-0 kubenswrapper[16352]: I0307 21:44:55.240129 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8cwkr" Mar 07 21:44:55.279002 master-0 kubenswrapper[16352]: I0307 21:44:55.278858 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:44:55.850739 master-0 kubenswrapper[16352]: I0307 21:44:55.850349 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-8cwkr"] Mar 07 21:44:55.851385 master-0 kubenswrapper[16352]: W0307 21:44:55.851178 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2efb703b_1fcb_4aef_b969_8b4afd7dc207.slice/crio-95ce90ea9921c70f4d20258bfeaa846246c9f2c2d9e158670e1d68cd6d479ff8 WatchSource:0}: Error finding container 95ce90ea9921c70f4d20258bfeaa846246c9f2c2d9e158670e1d68cd6d479ff8: Status 404 returned error can't find the container with id 95ce90ea9921c70f4d20258bfeaa846246c9f2c2d9e158670e1d68cd6d479ff8 Mar 07 21:44:55.971576 master-0 kubenswrapper[16352]: I0307 21:44:55.971499 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-host-discover-8g65x"] Mar 07 21:44:55.979697 master-0 kubenswrapper[16352]: W0307 21:44:55.979038 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2cdcd2d2_c8c0_40e0_8e36_8f9fe1af1b20.slice/crio-502f7e39c324b8fa58a31a8f53df4ee2057e289f53ef234668f9f50a0cfa48e3 WatchSource:0}: Error finding container 
502f7e39c324b8fa58a31a8f53df4ee2057e289f53ef234668f9f50a0cfa48e3: Status 404 returned error can't find the container with id 502f7e39c324b8fa58a31a8f53df4ee2057e289f53ef234668f9f50a0cfa48e3 Mar 07 21:44:56.718368 master-0 kubenswrapper[16352]: I0307 21:44:56.718236 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-8g65x" event={"ID":"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20","Type":"ContainerStarted","Data":"41fa12277d21a6dfe6be14c59d8ba3f97290786dd8fbba3281f16d944c8c69da"} Mar 07 21:44:56.718368 master-0 kubenswrapper[16352]: I0307 21:44:56.718330 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-8g65x" event={"ID":"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20","Type":"ContainerStarted","Data":"502f7e39c324b8fa58a31a8f53df4ee2057e289f53ef234668f9f50a0cfa48e3"} Mar 07 21:44:56.731015 master-0 kubenswrapper[16352]: I0307 21:44:56.730622 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8cwkr" event={"ID":"2efb703b-1fcb-4aef-b969-8b4afd7dc207","Type":"ContainerStarted","Data":"0fcbcd3425ac07f254302b3f7f585f9e11789071a133657f62c3151499244b38"} Mar 07 21:44:56.731015 master-0 kubenswrapper[16352]: I0307 21:44:56.730736 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8cwkr" event={"ID":"2efb703b-1fcb-4aef-b969-8b4afd7dc207","Type":"ContainerStarted","Data":"95ce90ea9921c70f4d20258bfeaa846246c9f2c2d9e158670e1d68cd6d479ff8"} Mar 07 21:44:56.749280 master-0 kubenswrapper[16352]: I0307 21:44:56.749123 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-host-discover-8g65x" podStartSLOduration=2.749083732 podStartE2EDuration="2.749083732s" podCreationTimestamp="2026-03-07 21:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:56.74522259 +0000 UTC 
m=+1619.815927729" watchObservedRunningTime="2026-03-07 21:44:56.749083732 +0000 UTC m=+1619.819788861" Mar 07 21:44:56.834383 master-0 kubenswrapper[16352]: I0307 21:44:56.834276 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-8cwkr" podStartSLOduration=2.8342506480000003 podStartE2EDuration="2.834250648s" podCreationTimestamp="2026-03-07 21:44:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:56.777846633 +0000 UTC m=+1619.848551722" watchObservedRunningTime="2026-03-07 21:44:56.834250648 +0000 UTC m=+1619.904955707" Mar 07 21:44:57.373732 master-0 kubenswrapper[16352]: I0307 21:44:57.373654 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:57.457776 master-0 kubenswrapper[16352]: I0307 21:44:57.457714 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-config-data\") pod \"6a083cbc-d960-42c1-841f-8c0a8b262f87\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " Mar 07 21:44:57.458239 master-0 kubenswrapper[16352]: I0307 21:44:57.458206 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lrqm6\" (UniqueName: \"kubernetes.io/projected/6a083cbc-d960-42c1-841f-8c0a8b262f87-kube-api-access-lrqm6\") pod \"6a083cbc-d960-42c1-841f-8c0a8b262f87\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " Mar 07 21:44:57.458446 master-0 kubenswrapper[16352]: I0307 21:44:57.458425 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a083cbc-d960-42c1-841f-8c0a8b262f87-logs\") pod \"6a083cbc-d960-42c1-841f-8c0a8b262f87\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " Mar 07 21:44:57.458732 
master-0 kubenswrapper[16352]: I0307 21:44:57.458710 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-combined-ca-bundle\") pod \"6a083cbc-d960-42c1-841f-8c0a8b262f87\" (UID: \"6a083cbc-d960-42c1-841f-8c0a8b262f87\") " Mar 07 21:44:57.464751 master-0 kubenswrapper[16352]: I0307 21:44:57.464554 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6a083cbc-d960-42c1-841f-8c0a8b262f87-logs" (OuterVolumeSpecName: "logs") pod "6a083cbc-d960-42c1-841f-8c0a8b262f87" (UID: "6a083cbc-d960-42c1-841f-8c0a8b262f87"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:44:57.471538 master-0 kubenswrapper[16352]: I0307 21:44:57.470272 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6a083cbc-d960-42c1-841f-8c0a8b262f87-kube-api-access-lrqm6" (OuterVolumeSpecName: "kube-api-access-lrqm6") pod "6a083cbc-d960-42c1-841f-8c0a8b262f87" (UID: "6a083cbc-d960-42c1-841f-8c0a8b262f87"). InnerVolumeSpecName "kube-api-access-lrqm6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:44:57.530423 master-0 kubenswrapper[16352]: I0307 21:44:57.530368 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-config-data" (OuterVolumeSpecName: "config-data") pod "6a083cbc-d960-42c1-841f-8c0a8b262f87" (UID: "6a083cbc-d960-42c1-841f-8c0a8b262f87"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:57.533130 master-0 kubenswrapper[16352]: I0307 21:44:57.533106 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "6a083cbc-d960-42c1-841f-8c0a8b262f87" (UID: "6a083cbc-d960-42c1-841f-8c0a8b262f87"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:44:57.586900 master-0 kubenswrapper[16352]: I0307 21:44:57.586818 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:57.586900 master-0 kubenswrapper[16352]: I0307 21:44:57.586901 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lrqm6\" (UniqueName: \"kubernetes.io/projected/6a083cbc-d960-42c1-841f-8c0a8b262f87-kube-api-access-lrqm6\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:57.587206 master-0 kubenswrapper[16352]: I0307 21:44:57.586921 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6a083cbc-d960-42c1-841f-8c0a8b262f87-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:57.587206 master-0 kubenswrapper[16352]: I0307 21:44:57.586965 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/6a083cbc-d960-42c1-841f-8c0a8b262f87-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:44:57.747653 master-0 kubenswrapper[16352]: I0307 21:44:57.747419 16352 generic.go:334] "Generic (PLEG): container finished" podID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerID="15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7" exitCode=0 Mar 07 21:44:57.747937 master-0 kubenswrapper[16352]: I0307 21:44:57.747530 
16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:57.749608 master-0 kubenswrapper[16352]: I0307 21:44:57.747531 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a083cbc-d960-42c1-841f-8c0a8b262f87","Type":"ContainerDied","Data":"15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7"} Mar 07 21:44:57.749856 master-0 kubenswrapper[16352]: I0307 21:44:57.749793 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"6a083cbc-d960-42c1-841f-8c0a8b262f87","Type":"ContainerDied","Data":"07114260e12516bce67b48f1c7a1928971eedd67163dbd4df2b2446b4b3e5982"} Mar 07 21:44:57.749911 master-0 kubenswrapper[16352]: I0307 21:44:57.749885 16352 scope.go:117] "RemoveContainer" containerID="15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7" Mar 07 21:44:57.781821 master-0 kubenswrapper[16352]: I0307 21:44:57.781772 16352 scope.go:117] "RemoveContainer" containerID="a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d" Mar 07 21:44:57.806273 master-0 kubenswrapper[16352]: I0307 21:44:57.806199 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:57.823729 master-0 kubenswrapper[16352]: I0307 21:44:57.823514 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:57.852218 master-0 kubenswrapper[16352]: I0307 21:44:57.852075 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:57.853160 master-0 kubenswrapper[16352]: E0307 21:44:57.853115 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-log" Mar 07 21:44:57.853160 master-0 kubenswrapper[16352]: I0307 21:44:57.853147 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" 
containerName="nova-api-log" Mar 07 21:44:57.853272 master-0 kubenswrapper[16352]: E0307 21:44:57.853236 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-api" Mar 07 21:44:57.853272 master-0 kubenswrapper[16352]: I0307 21:44:57.853270 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-api" Mar 07 21:44:57.853775 master-0 kubenswrapper[16352]: I0307 21:44:57.853738 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-log" Mar 07 21:44:57.853872 master-0 kubenswrapper[16352]: I0307 21:44:57.853834 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" containerName="nova-api-api" Mar 07 21:44:57.853914 master-0 kubenswrapper[16352]: I0307 21:44:57.853753 16352 scope.go:117] "RemoveContainer" containerID="15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7" Mar 07 21:44:57.854591 master-0 kubenswrapper[16352]: E0307 21:44:57.854536 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7\": container with ID starting with 15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7 not found: ID does not exist" containerID="15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7" Mar 07 21:44:57.854655 master-0 kubenswrapper[16352]: I0307 21:44:57.854603 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7"} err="failed to get container status \"15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7\": rpc error: code = NotFound desc = could not find container 
\"15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7\": container with ID starting with 15552f42e425d2c2b12cbba7aba6fe3daeb9a0b0c056ada633ba83a903a7b5d7 not found: ID does not exist" Mar 07 21:44:57.854655 master-0 kubenswrapper[16352]: I0307 21:44:57.854641 16352 scope.go:117] "RemoveContainer" containerID="a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d" Mar 07 21:44:57.855031 master-0 kubenswrapper[16352]: E0307 21:44:57.854982 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d\": container with ID starting with a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d not found: ID does not exist" containerID="a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d" Mar 07 21:44:57.855107 master-0 kubenswrapper[16352]: I0307 21:44:57.855028 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d"} err="failed to get container status \"a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d\": rpc error: code = NotFound desc = could not find container \"a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d\": container with ID starting with a8b8ed399cb469ec328d3f7af060d6cb27050655a45ddd6ac22d1f434b3c939d not found: ID does not exist" Mar 07 21:44:57.857186 master-0 kubenswrapper[16352]: I0307 21:44:57.857149 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:57.862612 master-0 kubenswrapper[16352]: I0307 21:44:57.862551 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 07 21:44:57.862861 master-0 kubenswrapper[16352]: I0307 21:44:57.862826 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 07 21:44:57.862911 master-0 kubenswrapper[16352]: I0307 21:44:57.862860 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 07 21:44:57.872781 master-0 kubenswrapper[16352]: I0307 21:44:57.872586 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:58.002325 master-0 kubenswrapper[16352]: I0307 21:44:58.002085 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-config-data\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.002325 master-0 kubenswrapper[16352]: I0307 21:44:58.002295 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-logs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.002785 master-0 kubenswrapper[16352]: I0307 21:44:58.002411 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-internal-tls-certs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.003141 master-0 kubenswrapper[16352]: I0307 21:44:58.003103 16352 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-public-tls-certs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.003502 master-0 kubenswrapper[16352]: I0307 21:44:58.003424 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9cxw4\" (UniqueName: \"kubernetes.io/projected/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-kube-api-access-9cxw4\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.003874 master-0 kubenswrapper[16352]: I0307 21:44:58.003842 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.107536 master-0 kubenswrapper[16352]: I0307 21:44:58.107451 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-public-tls-certs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.107887 master-0 kubenswrapper[16352]: I0307 21:44:58.107790 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9cxw4\" (UniqueName: \"kubernetes.io/projected/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-kube-api-access-9cxw4\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.108075 master-0 kubenswrapper[16352]: I0307 21:44:58.108040 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" 
(UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.108470 master-0 kubenswrapper[16352]: I0307 21:44:58.108434 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-config-data\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.108550 master-0 kubenswrapper[16352]: I0307 21:44:58.108489 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-logs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.109187 master-0 kubenswrapper[16352]: I0307 21:44:58.109143 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-logs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.109282 master-0 kubenswrapper[16352]: I0307 21:44:58.108592 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-internal-tls-certs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.122039 master-0 kubenswrapper[16352]: I0307 21:44:58.112447 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.122039 
master-0 kubenswrapper[16352]: I0307 21:44:58.112761 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-internal-tls-certs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.122039 master-0 kubenswrapper[16352]: I0307 21:44:58.112870 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-public-tls-certs\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.122039 master-0 kubenswrapper[16352]: I0307 21:44:58.113274 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-config-data\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.137148 master-0 kubenswrapper[16352]: I0307 21:44:58.137077 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9cxw4\" (UniqueName: \"kubernetes.io/projected/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-kube-api-access-9cxw4\") pod \"nova-api-0\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") " pod="openstack/nova-api-0" Mar 07 21:44:58.208407 master-0 kubenswrapper[16352]: I0307 21:44:58.208282 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 07 21:44:58.774455 master-0 kubenswrapper[16352]: W0307 21:44:58.774369 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod520d784b_a5ce_4bac_9f4f_3c37ce5b3a63.slice/crio-9b564ca08caf6721b06f21860c456ba73114f33e33a7485a6468b51f13c49dce WatchSource:0}: Error finding container 9b564ca08caf6721b06f21860c456ba73114f33e33a7485a6468b51f13c49dce: Status 404 returned error can't find the container with id 9b564ca08caf6721b06f21860c456ba73114f33e33a7485a6468b51f13c49dce Mar 07 21:44:58.790673 master-0 kubenswrapper[16352]: I0307 21:44:58.790614 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 07 21:44:59.234594 master-0 kubenswrapper[16352]: I0307 21:44:59.234508 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6a083cbc-d960-42c1-841f-8c0a8b262f87" path="/var/lib/kubelet/pods/6a083cbc-d960-42c1-841f-8c0a8b262f87/volumes" Mar 07 21:44:59.795941 master-0 kubenswrapper[16352]: I0307 21:44:59.795218 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63","Type":"ContainerStarted","Data":"f4bdd022fa3e5a7476fee44cbce158931932e7c77a69f68993bad8d6f28073cb"} Mar 07 21:44:59.797220 master-0 kubenswrapper[16352]: I0307 21:44:59.797191 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63","Type":"ContainerStarted","Data":"0883e502e66cddd496364a1ad1b27504225461e3a0e99e409e1ee8c965c19b3c"} Mar 07 21:44:59.797432 master-0 kubenswrapper[16352]: I0307 21:44:59.797386 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63","Type":"ContainerStarted","Data":"9b564ca08caf6721b06f21860c456ba73114f33e33a7485a6468b51f13c49dce"} Mar 07 21:44:59.797622 master-0 
kubenswrapper[16352]: I0307 21:44:59.797573 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-8g65x" event={"ID":"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20","Type":"ContainerDied","Data":"41fa12277d21a6dfe6be14c59d8ba3f97290786dd8fbba3281f16d944c8c69da"} Mar 07 21:44:59.797899 master-0 kubenswrapper[16352]: I0307 21:44:59.797457 16352 generic.go:334] "Generic (PLEG): container finished" podID="2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" containerID="41fa12277d21a6dfe6be14c59d8ba3f97290786dd8fbba3281f16d944c8c69da" exitCode=0 Mar 07 21:44:59.823078 master-0 kubenswrapper[16352]: I0307 21:44:59.822945 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.822925992 podStartE2EDuration="2.822925992s" podCreationTimestamp="2026-03-07 21:44:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:44:59.821106038 +0000 UTC m=+1622.891811107" watchObservedRunningTime="2026-03-07 21:44:59.822925992 +0000 UTC m=+1622.893631051" Mar 07 21:44:59.855996 master-0 kubenswrapper[16352]: E0307 21:44:59.855925 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:01.208291 master-0 kubenswrapper[16352]: I0307 21:45:01.208227 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5cc8bb4897-sws9x" Mar 07 21:45:01.333812 master-0 kubenswrapper[16352]: I0307 21:45:01.333743 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8459745b77-pkh7k"] Mar 07 21:45:01.334450 master-0 kubenswrapper[16352]: I0307 21:45:01.334414 16352 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" podUID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerName="dnsmasq-dns" containerID="cri-o://4a83490f6a2c30c916a511e48f131f32ce8ba7893ec155ff77a68c927c30c797" gracePeriod=10 Mar 07 21:45:01.553539 master-0 kubenswrapper[16352]: I0307 21:45:01.553462 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:45:01.681945 master-0 kubenswrapper[16352]: I0307 21:45:01.680719 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-scripts\") pod \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " Mar 07 21:45:01.681945 master-0 kubenswrapper[16352]: I0307 21:45:01.681327 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5qxqz\" (UniqueName: \"kubernetes.io/projected/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-kube-api-access-5qxqz\") pod \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " Mar 07 21:45:01.681945 master-0 kubenswrapper[16352]: I0307 21:45:01.681377 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-config-data\") pod \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " Mar 07 21:45:01.681945 master-0 kubenswrapper[16352]: I0307 21:45:01.681468 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-combined-ca-bundle\") pod \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\" (UID: \"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20\") " Mar 07 21:45:01.700363 master-0 kubenswrapper[16352]: I0307 21:45:01.700272 16352 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-kube-api-access-5qxqz" (OuterVolumeSpecName: "kube-api-access-5qxqz") pod "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" (UID: "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20"). InnerVolumeSpecName "kube-api-access-5qxqz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:45:01.701338 master-0 kubenswrapper[16352]: I0307 21:45:01.701242 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-scripts" (OuterVolumeSpecName: "scripts") pod "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" (UID: "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:01.719054 master-0 kubenswrapper[16352]: I0307 21:45:01.718988 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-config-data" (OuterVolumeSpecName: "config-data") pod "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" (UID: "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:01.791148 master-0 kubenswrapper[16352]: I0307 21:45:01.786501 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" (UID: "2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:01.791148 master-0 kubenswrapper[16352]: I0307 21:45:01.789118 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5qxqz\" (UniqueName: \"kubernetes.io/projected/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-kube-api-access-5qxqz\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:01.791148 master-0 kubenswrapper[16352]: I0307 21:45:01.789172 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:01.791148 master-0 kubenswrapper[16352]: I0307 21:45:01.789188 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:01.791148 master-0 kubenswrapper[16352]: I0307 21:45:01.789202 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20-scripts\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:01.876223 master-0 kubenswrapper[16352]: I0307 21:45:01.872654 16352 generic.go:334] "Generic (PLEG): container finished" podID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerID="4a83490f6a2c30c916a511e48f131f32ce8ba7893ec155ff77a68c927c30c797" exitCode=0 Mar 07 21:45:01.876223 master-0 kubenswrapper[16352]: I0307 21:45:01.872784 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" event={"ID":"f4bc275d-a9ea-41ac-840a-f9954b05742c","Type":"ContainerDied","Data":"4a83490f6a2c30c916a511e48f131f32ce8ba7893ec155ff77a68c927c30c797"} Mar 07 21:45:01.876223 master-0 kubenswrapper[16352]: I0307 21:45:01.875863 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-host-discover-8g65x" 
event={"ID":"2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20","Type":"ContainerDied","Data":"502f7e39c324b8fa58a31a8f53df4ee2057e289f53ef234668f9f50a0cfa48e3"} Mar 07 21:45:01.876223 master-0 kubenswrapper[16352]: I0307 21:45:01.875886 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="502f7e39c324b8fa58a31a8f53df4ee2057e289f53ef234668f9f50a0cfa48e3" Mar 07 21:45:01.876223 master-0 kubenswrapper[16352]: I0307 21:45:01.875952 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-host-discover-8g65x" Mar 07 21:45:01.878374 master-0 kubenswrapper[16352]: I0307 21:45:01.878308 16352 generic.go:334] "Generic (PLEG): container finished" podID="2efb703b-1fcb-4aef-b969-8b4afd7dc207" containerID="0fcbcd3425ac07f254302b3f7f585f9e11789071a133657f62c3151499244b38" exitCode=0 Mar 07 21:45:01.879635 master-0 kubenswrapper[16352]: I0307 21:45:01.878800 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8cwkr" event={"ID":"2efb703b-1fcb-4aef-b969-8b4afd7dc207","Type":"ContainerDied","Data":"0fcbcd3425ac07f254302b3f7f585f9e11789071a133657f62c3151499244b38"} Mar 07 21:45:01.985220 master-0 kubenswrapper[16352]: I0307 21:45:01.985146 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" Mar 07 21:45:02.098238 master-0 kubenswrapper[16352]: I0307 21:45:02.098174 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-swift-storage-0\") pod \"f4bc275d-a9ea-41ac-840a-f9954b05742c\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " Mar 07 21:45:02.098915 master-0 kubenswrapper[16352]: I0307 21:45:02.098888 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-config\") pod \"f4bc275d-a9ea-41ac-840a-f9954b05742c\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " Mar 07 21:45:02.099024 master-0 kubenswrapper[16352]: I0307 21:45:02.098917 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mbx4\" (UniqueName: \"kubernetes.io/projected/f4bc275d-a9ea-41ac-840a-f9954b05742c-kube-api-access-9mbx4\") pod \"f4bc275d-a9ea-41ac-840a-f9954b05742c\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " Mar 07 21:45:02.099024 master-0 kubenswrapper[16352]: I0307 21:45:02.098986 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-svc\") pod \"f4bc275d-a9ea-41ac-840a-f9954b05742c\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " Mar 07 21:45:02.099117 master-0 kubenswrapper[16352]: I0307 21:45:02.099060 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-sb\") pod \"f4bc275d-a9ea-41ac-840a-f9954b05742c\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") " Mar 07 21:45:02.099235 master-0 kubenswrapper[16352]: I0307 21:45:02.099221 16352 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-nb\") pod \"f4bc275d-a9ea-41ac-840a-f9954b05742c\" (UID: \"f4bc275d-a9ea-41ac-840a-f9954b05742c\") "
Mar 07 21:45:02.103446 master-0 kubenswrapper[16352]: I0307 21:45:02.103362 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f4bc275d-a9ea-41ac-840a-f9954b05742c-kube-api-access-9mbx4" (OuterVolumeSpecName: "kube-api-access-9mbx4") pod "f4bc275d-a9ea-41ac-840a-f9954b05742c" (UID: "f4bc275d-a9ea-41ac-840a-f9954b05742c"). InnerVolumeSpecName "kube-api-access-9mbx4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:45:02.165553 master-0 kubenswrapper[16352]: I0307 21:45:02.165456 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "f4bc275d-a9ea-41ac-840a-f9954b05742c" (UID: "f4bc275d-a9ea-41ac-840a-f9954b05742c"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:45:02.167390 master-0 kubenswrapper[16352]: I0307 21:45:02.167348 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "f4bc275d-a9ea-41ac-840a-f9954b05742c" (UID: "f4bc275d-a9ea-41ac-840a-f9954b05742c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:45:02.170472 master-0 kubenswrapper[16352]: I0307 21:45:02.170398 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "f4bc275d-a9ea-41ac-840a-f9954b05742c" (UID: "f4bc275d-a9ea-41ac-840a-f9954b05742c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:45:02.173712 master-0 kubenswrapper[16352]: I0307 21:45:02.173521 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-config" (OuterVolumeSpecName: "config") pod "f4bc275d-a9ea-41ac-840a-f9954b05742c" (UID: "f4bc275d-a9ea-41ac-840a-f9954b05742c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:45:02.173817 master-0 kubenswrapper[16352]: I0307 21:45:02.173755 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "f4bc275d-a9ea-41ac-840a-f9954b05742c" (UID: "f4bc275d-a9ea-41ac-840a-f9954b05742c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 07 21:45:02.207765 master-0 kubenswrapper[16352]: I0307 21:45:02.202455 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-sb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:02.207765 master-0 kubenswrapper[16352]: I0307 21:45:02.202535 16352 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-ovsdbserver-nb\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:02.207765 master-0 kubenswrapper[16352]: I0307 21:45:02.202559 16352 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-swift-storage-0\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:02.207765 master-0 kubenswrapper[16352]: I0307 21:45:02.202585 16352 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-config\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:02.207765 master-0 kubenswrapper[16352]: I0307 21:45:02.202817 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9mbx4\" (UniqueName: \"kubernetes.io/projected/f4bc275d-a9ea-41ac-840a-f9954b05742c-kube-api-access-9mbx4\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:02.207765 master-0 kubenswrapper[16352]: I0307 21:45:02.202846 16352 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f4bc275d-a9ea-41ac-840a-f9954b05742c-dns-svc\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:02.900989 master-0 kubenswrapper[16352]: I0307 21:45:02.900920 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8459745b77-pkh7k"
Mar 07 21:45:02.900989 master-0 kubenswrapper[16352]: I0307 21:45:02.900917 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8459745b77-pkh7k" event={"ID":"f4bc275d-a9ea-41ac-840a-f9954b05742c","Type":"ContainerDied","Data":"dec83259d2ac6258b0daf9530dda83637c1ab1ebf153a53483b971b6cc9d4672"}
Mar 07 21:45:02.902170 master-0 kubenswrapper[16352]: I0307 21:45:02.901041 16352 scope.go:117] "RemoveContainer" containerID="4a83490f6a2c30c916a511e48f131f32ce8ba7893ec155ff77a68c927c30c797"
Mar 07 21:45:02.955080 master-0 kubenswrapper[16352]: I0307 21:45:02.955013 16352 scope.go:117] "RemoveContainer" containerID="c4d81c90e2f0bee9707d97ca52bcbf6a09a0e7806ca42779a4c6ba4ac79f3046"
Mar 07 21:45:02.987033 master-0 kubenswrapper[16352]: I0307 21:45:02.986924 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8459745b77-pkh7k"]
Mar 07 21:45:03.004838 master-0 kubenswrapper[16352]: I0307 21:45:03.004729 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8459745b77-pkh7k"]
Mar 07 21:45:03.214730 master-0 kubenswrapper[16352]: I0307 21:45:03.214510 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4bc275d-a9ea-41ac-840a-f9954b05742c" path="/var/lib/kubelet/pods/f4bc275d-a9ea-41ac-840a-f9954b05742c/volumes"
Mar 07 21:45:03.488062 master-0 kubenswrapper[16352]: I0307 21:45:03.487913 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8cwkr"
Mar 07 21:45:03.561479 master-0 kubenswrapper[16352]: I0307 21:45:03.559134 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-config-data\") pod \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") "
Mar 07 21:45:03.561479 master-0 kubenswrapper[16352]: I0307 21:45:03.559606 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-scripts\") pod \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") "
Mar 07 21:45:03.561479 master-0 kubenswrapper[16352]: I0307 21:45:03.559745 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-combined-ca-bundle\") pod \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") "
Mar 07 21:45:03.561479 master-0 kubenswrapper[16352]: I0307 21:45:03.559966 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-48cck\" (UniqueName: \"kubernetes.io/projected/2efb703b-1fcb-4aef-b969-8b4afd7dc207-kube-api-access-48cck\") pod \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\" (UID: \"2efb703b-1fcb-4aef-b969-8b4afd7dc207\") "
Mar 07 21:45:03.565109 master-0 kubenswrapper[16352]: I0307 21:45:03.565070 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2efb703b-1fcb-4aef-b969-8b4afd7dc207-kube-api-access-48cck" (OuterVolumeSpecName: "kube-api-access-48cck") pod "2efb703b-1fcb-4aef-b969-8b4afd7dc207" (UID: "2efb703b-1fcb-4aef-b969-8b4afd7dc207"). InnerVolumeSpecName "kube-api-access-48cck". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:45:03.568036 master-0 kubenswrapper[16352]: I0307 21:45:03.567970 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-scripts" (OuterVolumeSpecName: "scripts") pod "2efb703b-1fcb-4aef-b969-8b4afd7dc207" (UID: "2efb703b-1fcb-4aef-b969-8b4afd7dc207"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:45:03.613969 master-0 kubenswrapper[16352]: I0307 21:45:03.613883 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-config-data" (OuterVolumeSpecName: "config-data") pod "2efb703b-1fcb-4aef-b969-8b4afd7dc207" (UID: "2efb703b-1fcb-4aef-b969-8b4afd7dc207"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:45:03.626628 master-0 kubenswrapper[16352]: I0307 21:45:03.626555 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2efb703b-1fcb-4aef-b969-8b4afd7dc207" (UID: "2efb703b-1fcb-4aef-b969-8b4afd7dc207"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:45:03.663729 master-0 kubenswrapper[16352]: I0307 21:45:03.663639 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:03.663729 master-0 kubenswrapper[16352]: I0307 21:45:03.663714 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-48cck\" (UniqueName: \"kubernetes.io/projected/2efb703b-1fcb-4aef-b969-8b4afd7dc207-kube-api-access-48cck\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:03.663729 master-0 kubenswrapper[16352]: I0307 21:45:03.663728 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:03.663729 master-0 kubenswrapper[16352]: I0307 21:45:03.663737 16352 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2efb703b-1fcb-4aef-b969-8b4afd7dc207-scripts\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:03.921385 master-0 kubenswrapper[16352]: I0307 21:45:03.921298 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-8cwkr" event={"ID":"2efb703b-1fcb-4aef-b969-8b4afd7dc207","Type":"ContainerDied","Data":"95ce90ea9921c70f4d20258bfeaa846246c9f2c2d9e158670e1d68cd6d479ff8"}
Mar 07 21:45:03.921385 master-0 kubenswrapper[16352]: I0307 21:45:03.921377 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95ce90ea9921c70f4d20258bfeaa846246c9f2c2d9e158670e1d68cd6d479ff8"
Mar 07 21:45:03.922181 master-0 kubenswrapper[16352]: I0307 21:45:03.921414 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-8cwkr"
Mar 07 21:45:04.230568 master-0 kubenswrapper[16352]: I0307 21:45:04.230363 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:45:04.230891 master-0 kubenswrapper[16352]: I0307 21:45:04.230740 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-log" containerID="cri-o://0883e502e66cddd496364a1ad1b27504225461e3a0e99e409e1ee8c965c19b3c" gracePeriod=30
Mar 07 21:45:04.231156 master-0 kubenswrapper[16352]: I0307 21:45:04.231085 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-api" containerID="cri-o://f4bdd022fa3e5a7476fee44cbce158931932e7c77a69f68993bad8d6f28073cb" gracePeriod=30
Mar 07 21:45:04.283258 master-0 kubenswrapper[16352]: I0307 21:45:04.283173 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"]
Mar 07 21:45:04.283714 master-0 kubenswrapper[16352]: I0307 21:45:04.283642 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-log" containerID="cri-o://e1e87fade9c512cebf40ef76d354b9544a5dd6710efebf3df25c1d874089ecab" gracePeriod=30
Mar 07 21:45:04.283899 master-0 kubenswrapper[16352]: I0307 21:45:04.283696 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-metadata" containerID="cri-o://6be1d1f530d0c7bdbc0d1c8090301072dfb8782fe72d092e1f23a3f4350c5f5c" gracePeriod=30
Mar 07 21:45:04.300297 master-0 kubenswrapper[16352]: I0307 21:45:04.300077 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"]
Mar 07 21:45:04.300484 master-0 kubenswrapper[16352]: I0307 21:45:04.300338 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="acd4977f-05c9-482d-8c7b-4178e7ceb659" containerName="nova-scheduler-scheduler" containerID="cri-o://df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65" gracePeriod=30
Mar 07 21:45:04.955036 master-0 kubenswrapper[16352]: I0307 21:45:04.953771 16352 generic.go:334] "Generic (PLEG): container finished" podID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerID="f4bdd022fa3e5a7476fee44cbce158931932e7c77a69f68993bad8d6f28073cb" exitCode=0
Mar 07 21:45:04.955036 master-0 kubenswrapper[16352]: I0307 21:45:04.953821 16352 generic.go:334] "Generic (PLEG): container finished" podID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerID="0883e502e66cddd496364a1ad1b27504225461e3a0e99e409e1ee8c965c19b3c" exitCode=143
Mar 07 21:45:04.955036 master-0 kubenswrapper[16352]: I0307 21:45:04.953876 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63","Type":"ContainerDied","Data":"f4bdd022fa3e5a7476fee44cbce158931932e7c77a69f68993bad8d6f28073cb"}
Mar 07 21:45:04.955036 master-0 kubenswrapper[16352]: I0307 21:45:04.953908 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63","Type":"ContainerDied","Data":"0883e502e66cddd496364a1ad1b27504225461e3a0e99e409e1ee8c965c19b3c"}
Mar 07 21:45:04.956955 master-0 kubenswrapper[16352]: I0307 21:45:04.956491 16352 generic.go:334] "Generic (PLEG): container finished" podID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerID="e1e87fade9c512cebf40ef76d354b9544a5dd6710efebf3df25c1d874089ecab" exitCode=143
Mar 07 21:45:04.956955 master-0 kubenswrapper[16352]: I0307 21:45:04.956531 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb","Type":"ContainerDied","Data":"e1e87fade9c512cebf40ef76d354b9544a5dd6710efebf3df25c1d874089ecab"}
Mar 07 21:45:05.259549 master-0 kubenswrapper[16352]: I0307 21:45:05.259362 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 21:45:05.305713 master-0 kubenswrapper[16352]: I0307 21:45:05.305608 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-logs\") pod \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") "
Mar 07 21:45:05.306074 master-0 kubenswrapper[16352]: I0307 21:45:05.305823 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9cxw4\" (UniqueName: \"kubernetes.io/projected/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-kube-api-access-9cxw4\") pod \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") "
Mar 07 21:45:05.306074 master-0 kubenswrapper[16352]: I0307 21:45:05.305952 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-combined-ca-bundle\") pod \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") "
Mar 07 21:45:05.306074 master-0 kubenswrapper[16352]: I0307 21:45:05.305985 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-logs" (OuterVolumeSpecName: "logs") pod "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" (UID: "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 07 21:45:05.306074 master-0 kubenswrapper[16352]: I0307 21:45:05.306050 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-public-tls-certs\") pod \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") "
Mar 07 21:45:05.306279 master-0 kubenswrapper[16352]: I0307 21:45:05.306135 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-config-data\") pod \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") "
Mar 07 21:45:05.306537 master-0 kubenswrapper[16352]: I0307 21:45:05.306502 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-internal-tls-certs\") pod \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\" (UID: \"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63\") "
Mar 07 21:45:05.308025 master-0 kubenswrapper[16352]: I0307 21:45:05.307976 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-logs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:05.315803 master-0 kubenswrapper[16352]: I0307 21:45:05.315472 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-kube-api-access-9cxw4" (OuterVolumeSpecName: "kube-api-access-9cxw4") pod "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" (UID: "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63"). InnerVolumeSpecName "kube-api-access-9cxw4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 07 21:45:05.340469 master-0 kubenswrapper[16352]: I0307 21:45:05.340377 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-config-data" (OuterVolumeSpecName: "config-data") pod "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" (UID: "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:45:05.367953 master-0 kubenswrapper[16352]: I0307 21:45:05.367886 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" (UID: "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:45:05.378942 master-0 kubenswrapper[16352]: I0307 21:45:05.378878 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" (UID: "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:45:05.396411 master-0 kubenswrapper[16352]: I0307 21:45:05.396322 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" (UID: "520d784b-a5ce-4bac-9f4f-3c37ce5b3a63"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 07 21:45:05.410820 master-0 kubenswrapper[16352]: I0307 21:45:05.410744 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9cxw4\" (UniqueName: \"kubernetes.io/projected/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-kube-api-access-9cxw4\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:05.410943 master-0 kubenswrapper[16352]: I0307 21:45:05.410917 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-combined-ca-bundle\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:05.410992 master-0 kubenswrapper[16352]: I0307 21:45:05.410941 16352 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-public-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:05.411096 master-0 kubenswrapper[16352]: I0307 21:45:05.411046 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-config-data\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:05.411096 master-0 kubenswrapper[16352]: I0307 21:45:05.411089 16352 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63-internal-tls-certs\") on node \"master-0\" DevicePath \"\""
Mar 07 21:45:05.572811 master-0 kubenswrapper[16352]: E0307 21:45:05.570419 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 07 21:45:05.574691 master-0 kubenswrapper[16352]: E0307 21:45:05.574592 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 07 21:45:05.594259 master-0 kubenswrapper[16352]: E0307 21:45:05.594145 16352 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"]
Mar 07 21:45:05.594578 master-0 kubenswrapper[16352]: E0307 21:45:05.594275 16352 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="acd4977f-05c9-482d-8c7b-4178e7ceb659" containerName="nova-scheduler-scheduler"
Mar 07 21:45:05.845770 master-0 kubenswrapper[16352]: E0307 21:45:05.845510 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]"
Mar 07 21:45:05.975048 master-0 kubenswrapper[16352]: I0307 21:45:05.974905 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"520d784b-a5ce-4bac-9f4f-3c37ce5b3a63","Type":"ContainerDied","Data":"9b564ca08caf6721b06f21860c456ba73114f33e33a7485a6468b51f13c49dce"}
Mar 07 21:45:05.975048 master-0 kubenswrapper[16352]: I0307 21:45:05.975002 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 21:45:05.975048 master-0 kubenswrapper[16352]: I0307 21:45:05.975018 16352 scope.go:117] "RemoveContainer" containerID="f4bdd022fa3e5a7476fee44cbce158931932e7c77a69f68993bad8d6f28073cb"
Mar 07 21:45:06.012494 master-0 kubenswrapper[16352]: I0307 21:45:06.012180 16352 scope.go:117] "RemoveContainer" containerID="0883e502e66cddd496364a1ad1b27504225461e3a0e99e409e1ee8c965c19b3c"
Mar 07 21:45:06.061121 master-0 kubenswrapper[16352]: I0307 21:45:06.061021 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:45:06.090901 master-0 kubenswrapper[16352]: I0307 21:45:06.088433 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:45:06.116122 master-0 kubenswrapper[16352]: I0307 21:45:06.115954 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:45:06.116925 master-0 kubenswrapper[16352]: E0307 21:45:06.116882 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerName="init"
Mar 07 21:45:06.116925 master-0 kubenswrapper[16352]: I0307 21:45:06.116918 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerName="init"
Mar 07 21:45:06.117055 master-0 kubenswrapper[16352]: E0307 21:45:06.116961 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerName="dnsmasq-dns"
Mar 07 21:45:06.117055 master-0 kubenswrapper[16352]: I0307 21:45:06.116971 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerName="dnsmasq-dns"
Mar 07 21:45:06.117055 master-0 kubenswrapper[16352]: E0307 21:45:06.117021 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" containerName="nova-manage"
Mar 07 21:45:06.117055 master-0 kubenswrapper[16352]: I0307 21:45:06.117032 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" containerName="nova-manage"
Mar 07 21:45:06.117055 master-0 kubenswrapper[16352]: E0307 21:45:06.117047 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-log"
Mar 07 21:45:06.117055 master-0 kubenswrapper[16352]: I0307 21:45:06.117056 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-log"
Mar 07 21:45:06.117304 master-0 kubenswrapper[16352]: E0307 21:45:06.117080 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2efb703b-1fcb-4aef-b969-8b4afd7dc207" containerName="nova-manage"
Mar 07 21:45:06.117304 master-0 kubenswrapper[16352]: I0307 21:45:06.117091 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="2efb703b-1fcb-4aef-b969-8b4afd7dc207" containerName="nova-manage"
Mar 07 21:45:06.117304 master-0 kubenswrapper[16352]: E0307 21:45:06.117119 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-api"
Mar 07 21:45:06.117304 master-0 kubenswrapper[16352]: I0307 21:45:06.117127 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-api"
Mar 07 21:45:06.117476 master-0 kubenswrapper[16352]: I0307 21:45:06.117460 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-log"
Mar 07 21:45:06.117544 master-0 kubenswrapper[16352]: I0307 21:45:06.117480 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4bc275d-a9ea-41ac-840a-f9954b05742c" containerName="dnsmasq-dns"
Mar 07 21:45:06.117544 master-0 kubenswrapper[16352]: I0307 21:45:06.117498 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" containerName="nova-manage"
Mar 07 21:45:06.117544 master-0 kubenswrapper[16352]: I0307 21:45:06.117536 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" containerName="nova-api-api"
Mar 07 21:45:06.117706 master-0 kubenswrapper[16352]: I0307 21:45:06.117571 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="2efb703b-1fcb-4aef-b969-8b4afd7dc207" containerName="nova-manage"
Mar 07 21:45:06.119613 master-0 kubenswrapper[16352]: I0307 21:45:06.119535 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 21:45:06.123086 master-0 kubenswrapper[16352]: I0307 21:45:06.123009 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data"
Mar 07 21:45:06.123186 master-0 kubenswrapper[16352]: I0307 21:45:06.123107 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc"
Mar 07 21:45:06.123527 master-0 kubenswrapper[16352]: I0307 21:45:06.123418 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc"
Mar 07 21:45:06.128578 master-0 kubenswrapper[16352]: I0307 21:45:06.128504 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:45:06.136173 master-0 kubenswrapper[16352]: I0307 21:45:06.136114 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-public-tls-certs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.136475 master-0 kubenswrapper[16352]: I0307 21:45:06.136240 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-config-data\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.136552 master-0 kubenswrapper[16352]: I0307 21:45:06.136497 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.136823 master-0 kubenswrapper[16352]: I0307 21:45:06.136752 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-logs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.139372 master-0 kubenswrapper[16352]: I0307 21:45:06.136836 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-67t4w\" (UniqueName: \"kubernetes.io/projected/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-kube-api-access-67t4w\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.139372 master-0 kubenswrapper[16352]: I0307 21:45:06.136860 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.240853 master-0 kubenswrapper[16352]: I0307 21:45:06.240461 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-public-tls-certs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.240853 master-0 kubenswrapper[16352]: I0307 21:45:06.240863 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-config-data\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.241309 master-0 kubenswrapper[16352]: I0307 21:45:06.241030 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.241456 master-0 kubenswrapper[16352]: I0307 21:45:06.241357 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-logs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.241799 master-0 kubenswrapper[16352]: I0307 21:45:06.241733 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-67t4w\" (UniqueName: \"kubernetes.io/projected/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-kube-api-access-67t4w\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.241976 master-0 kubenswrapper[16352]: I0307 21:45:06.241885 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.241976 master-0 kubenswrapper[16352]: I0307 21:45:06.241913 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-logs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.247484 master-0 kubenswrapper[16352]: I0307 21:45:06.247401 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-config-data\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.247928 master-0 kubenswrapper[16352]: I0307 21:45:06.247867 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.248193 master-0 kubenswrapper[16352]: I0307 21:45:06.248133 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-public-tls-certs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.248375 master-0 kubenswrapper[16352]: I0307 21:45:06.248309 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-internal-tls-certs\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.271041 master-0 kubenswrapper[16352]: I0307 21:45:06.270957 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-67t4w\" (UniqueName: \"kubernetes.io/projected/20b6be5e-a0d3-491d-80d9-dfe0a69e19d6-kube-api-access-67t4w\") pod \"nova-api-0\" (UID: \"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6\") " pod="openstack/nova-api-0"
Mar 07 21:45:06.443318 master-0 kubenswrapper[16352]: I0307 21:45:06.443014 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0"
Mar 07 21:45:07.037530 master-0 kubenswrapper[16352]: I0307 21:45:07.037472 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"]
Mar 07 21:45:07.222824 master-0 kubenswrapper[16352]: I0307 21:45:07.218975 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="520d784b-a5ce-4bac-9f4f-3c37ce5b3a63" path="/var/lib/kubelet/pods/520d784b-a5ce-4bac-9f4f-3c37ce5b3a63/volumes"
Mar 07 21:45:07.440234 master-0 kubenswrapper[16352]: I0307 21:45:07.440115 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.8:8775/\": read tcp 10.128.0.2:45006->10.128.1.8:8775: read: connection reset by peer"
Mar 07 21:45:07.440234 master-0 kubenswrapper[16352]: I0307 21:45:07.440190 16352 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/nova-metadata-0" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.8:8775/\": read tcp 10.128.0.2:45012->10.128.1.8:8775: read: connection reset by peer"
Mar 07 21:45:08.011173 master-0 kubenswrapper[16352]: I0307 21:45:08.011095 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6","Type":"ContainerStarted","Data":"efcef80ea63c2fb7da198b92e92d6bdca19b26a758ff88ccb1db61dc2eae2cb4"}
Mar 07 21:45:08.011173 master-0 kubenswrapper[16352]: I0307 21:45:08.011171 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6","Type":"ContainerStarted","Data":"32a27dd67b1433156897bcc4db92826e5a4226b8f8e765b8745880594139f1cf"}
Mar 07 21:45:08.011173 master-0 kubenswrapper[16352]: I0307 21:45:08.011185 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"20b6be5e-a0d3-491d-80d9-dfe0a69e19d6","Type":"ContainerStarted","Data":"62b2f4e3072bb72b130df9a97bec29d43a74ee711581724f8d04946d6997dcb0"}
Mar 07 21:45:08.015276 master-0 kubenswrapper[16352]: I0307 21:45:08.015229 16352 generic.go:334] "Generic (PLEG): container finished" podID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerID="6be1d1f530d0c7bdbc0d1c8090301072dfb8782fe72d092e1f23a3f4350c5f5c" exitCode=0
Mar 07 21:45:08.015276 master-0 kubenswrapper[16352]: I0307 21:45:08.015274 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb","Type":"ContainerDied","Data":"6be1d1f530d0c7bdbc0d1c8090301072dfb8782fe72d092e1f23a3f4350c5f5c"}
Mar 07 21:45:08.015408 master-0 kubenswrapper[16352]: I0307 21:45:08.015296 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb","Type":"ContainerDied","Data":"d3617f179844d894479ec8f8d4713749d5fd5b4f930d55c83cdd4ee0395145d1"}
Mar 07 21:45:08.015408 master-0 kubenswrapper[16352]: I0307 21:45:08.015309 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3617f179844d894479ec8f8d4713749d5fd5b4f930d55c83cdd4ee0395145d1"
Mar 07 21:45:08.019662 master-0 kubenswrapper[16352]: I0307 21:45:08.019642 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:45:08.092693 master-0 kubenswrapper[16352]: I0307 21:45:08.090993 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.090960495 podStartE2EDuration="2.090960495s" podCreationTimestamp="2026-03-07 21:45:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:45:08.03956015 +0000 UTC m=+1631.110265209" watchObservedRunningTime="2026-03-07 21:45:08.090960495 +0000 UTC m=+1631.161665554" Mar 07 21:45:08.222047 master-0 kubenswrapper[16352]: I0307 21:45:08.221569 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-config-data\") pod \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " Mar 07 21:45:08.222349 master-0 kubenswrapper[16352]: I0307 21:45:08.222236 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-nova-metadata-tls-certs\") pod \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " Mar 07 21:45:08.222349 master-0 kubenswrapper[16352]: I0307 21:45:08.222333 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-logs\") pod \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " Mar 07 21:45:08.223503 master-0 kubenswrapper[16352]: I0307 21:45:08.223385 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-combined-ca-bundle\") pod 
\"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " Mar 07 21:45:08.223702 master-0 kubenswrapper[16352]: I0307 21:45:08.223440 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-logs" (OuterVolumeSpecName: "logs") pod "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" (UID: "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 07 21:45:08.223757 master-0 kubenswrapper[16352]: I0307 21:45:08.223647 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l47wg\" (UniqueName: \"kubernetes.io/projected/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-kube-api-access-l47wg\") pod \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\" (UID: \"ae9bf672-4d9f-4146-92d2-1dbf658dfcbb\") " Mar 07 21:45:08.224795 master-0 kubenswrapper[16352]: I0307 21:45:08.224754 16352 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-logs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:08.232164 master-0 kubenswrapper[16352]: I0307 21:45:08.232077 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-kube-api-access-l47wg" (OuterVolumeSpecName: "kube-api-access-l47wg") pod "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" (UID: "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb"). InnerVolumeSpecName "kube-api-access-l47wg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:45:08.274381 master-0 kubenswrapper[16352]: I0307 21:45:08.274311 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" (UID: "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:08.289410 master-0 kubenswrapper[16352]: I0307 21:45:08.289273 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" (UID: "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:08.289844 master-0 kubenswrapper[16352]: I0307 21:45:08.289597 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-config-data" (OuterVolumeSpecName: "config-data") pod "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" (UID: "ae9bf672-4d9f-4146-92d2-1dbf658dfcbb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:08.328472 master-0 kubenswrapper[16352]: I0307 21:45:08.327787 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:08.328472 master-0 kubenswrapper[16352]: I0307 21:45:08.327860 16352 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-nova-metadata-tls-certs\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:08.328472 master-0 kubenswrapper[16352]: I0307 21:45:08.327884 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:08.328472 master-0 kubenswrapper[16352]: I0307 21:45:08.327896 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l47wg\" (UniqueName: \"kubernetes.io/projected/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb-kube-api-access-l47wg\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:09.030007 master-0 kubenswrapper[16352]: I0307 21:45:09.029917 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:45:09.092482 master-0 kubenswrapper[16352]: I0307 21:45:09.092400 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:45:09.104875 master-0 kubenswrapper[16352]: I0307 21:45:09.104808 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:45:09.130080 master-0 kubenswrapper[16352]: I0307 21:45:09.130000 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:45:09.130832 master-0 kubenswrapper[16352]: E0307 21:45:09.130802 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-metadata" Mar 07 21:45:09.130906 master-0 kubenswrapper[16352]: I0307 21:45:09.130833 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-metadata" Mar 07 21:45:09.130906 master-0 kubenswrapper[16352]: E0307 21:45:09.130881 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-log" Mar 07 21:45:09.130906 master-0 kubenswrapper[16352]: I0307 21:45:09.130890 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-log" Mar 07 21:45:09.131198 master-0 kubenswrapper[16352]: I0307 21:45:09.131177 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-metadata" Mar 07 21:45:09.131259 master-0 kubenswrapper[16352]: I0307 21:45:09.131249 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" containerName="nova-metadata-log" Mar 07 21:45:09.133144 master-0 kubenswrapper[16352]: I0307 21:45:09.133111 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:45:09.136396 master-0 kubenswrapper[16352]: I0307 21:45:09.136362 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 07 21:45:09.136791 master-0 kubenswrapper[16352]: I0307 21:45:09.136755 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 07 21:45:09.151950 master-0 kubenswrapper[16352]: I0307 21:45:09.151871 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:45:09.154019 master-0 kubenswrapper[16352]: I0307 21:45:09.153954 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.154589 master-0 kubenswrapper[16352]: I0307 21:45:09.154537 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-config-data\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.155224 master-0 kubenswrapper[16352]: I0307 21:45:09.155128 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a74b825-c8eb-4977-8524-80215b306e51-logs\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.155413 master-0 kubenswrapper[16352]: I0307 21:45:09.155330 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sc77g\" (UniqueName: 
\"kubernetes.io/projected/0a74b825-c8eb-4977-8524-80215b306e51-kube-api-access-sc77g\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.155953 master-0 kubenswrapper[16352]: I0307 21:45:09.155788 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.204344 master-0 kubenswrapper[16352]: I0307 21:45:09.204257 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae9bf672-4d9f-4146-92d2-1dbf658dfcbb" path="/var/lib/kubelet/pods/ae9bf672-4d9f-4146-92d2-1dbf658dfcbb/volumes" Mar 07 21:45:09.258772 master-0 kubenswrapper[16352]: I0307 21:45:09.258606 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-config-data\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.258772 master-0 kubenswrapper[16352]: I0307 21:45:09.258780 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a74b825-c8eb-4977-8524-80215b306e51-logs\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.259202 master-0 kubenswrapper[16352]: I0307 21:45:09.258821 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sc77g\" (UniqueName: \"kubernetes.io/projected/0a74b825-c8eb-4977-8524-80215b306e51-kube-api-access-sc77g\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.259202 
master-0 kubenswrapper[16352]: I0307 21:45:09.258888 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.259450 master-0 kubenswrapper[16352]: I0307 21:45:09.259357 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.260183 master-0 kubenswrapper[16352]: I0307 21:45:09.260128 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0a74b825-c8eb-4977-8524-80215b306e51-logs\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.264508 master-0 kubenswrapper[16352]: I0307 21:45:09.264460 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.265001 master-0 kubenswrapper[16352]: I0307 21:45:09.264933 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-config-data\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.273639 master-0 kubenswrapper[16352]: I0307 21:45:09.273562 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0a74b825-c8eb-4977-8524-80215b306e51-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.277538 master-0 kubenswrapper[16352]: I0307 21:45:09.277465 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sc77g\" (UniqueName: \"kubernetes.io/projected/0a74b825-c8eb-4977-8524-80215b306e51-kube-api-access-sc77g\") pod \"nova-metadata-0\" (UID: \"0a74b825-c8eb-4977-8524-80215b306e51\") " pod="openstack/nova-metadata-0" Mar 07 21:45:09.474890 master-0 kubenswrapper[16352]: I0307 21:45:09.474788 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 07 21:45:09.941461 master-0 kubenswrapper[16352]: E0307 21:45:09.941369 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:10.027792 master-0 kubenswrapper[16352]: W0307 21:45:10.027704 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0a74b825_c8eb_4977_8524_80215b306e51.slice/crio-8051f0873c7e7eb564db66b4529908b96f3a921d9b10cfcb534975ff045b8795 WatchSource:0}: Error finding container 8051f0873c7e7eb564db66b4529908b96f3a921d9b10cfcb534975ff045b8795: Status 404 returned error can't find the container with id 8051f0873c7e7eb564db66b4529908b96f3a921d9b10cfcb534975ff045b8795 Mar 07 21:45:10.038054 master-0 kubenswrapper[16352]: I0307 21:45:10.037981 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 07 21:45:10.047380 master-0 kubenswrapper[16352]: I0307 21:45:10.047279 16352 generic.go:334] "Generic (PLEG): container finished" 
podID="acd4977f-05c9-482d-8c7b-4178e7ceb659" containerID="df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65" exitCode=0 Mar 07 21:45:10.047380 master-0 kubenswrapper[16352]: I0307 21:45:10.047349 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"acd4977f-05c9-482d-8c7b-4178e7ceb659","Type":"ContainerDied","Data":"df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65"} Mar 07 21:45:10.049053 master-0 kubenswrapper[16352]: I0307 21:45:10.048985 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a74b825-c8eb-4977-8524-80215b306e51","Type":"ContainerStarted","Data":"8051f0873c7e7eb564db66b4529908b96f3a921d9b10cfcb534975ff045b8795"} Mar 07 21:45:10.133146 master-0 kubenswrapper[16352]: I0307 21:45:10.133068 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:45:10.194511 master-0 kubenswrapper[16352]: I0307 21:45:10.194410 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ncb2\" (UniqueName: \"kubernetes.io/projected/acd4977f-05c9-482d-8c7b-4178e7ceb659-kube-api-access-4ncb2\") pod \"acd4977f-05c9-482d-8c7b-4178e7ceb659\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " Mar 07 21:45:10.194909 master-0 kubenswrapper[16352]: I0307 21:45:10.194723 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-combined-ca-bundle\") pod \"acd4977f-05c9-482d-8c7b-4178e7ceb659\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " Mar 07 21:45:10.194909 master-0 kubenswrapper[16352]: I0307 21:45:10.194797 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-config-data\") pod 
\"acd4977f-05c9-482d-8c7b-4178e7ceb659\" (UID: \"acd4977f-05c9-482d-8c7b-4178e7ceb659\") " Mar 07 21:45:10.205765 master-0 kubenswrapper[16352]: I0307 21:45:10.204130 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acd4977f-05c9-482d-8c7b-4178e7ceb659-kube-api-access-4ncb2" (OuterVolumeSpecName: "kube-api-access-4ncb2") pod "acd4977f-05c9-482d-8c7b-4178e7ceb659" (UID: "acd4977f-05c9-482d-8c7b-4178e7ceb659"). InnerVolumeSpecName "kube-api-access-4ncb2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:45:10.263325 master-0 kubenswrapper[16352]: I0307 21:45:10.263238 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "acd4977f-05c9-482d-8c7b-4178e7ceb659" (UID: "acd4977f-05c9-482d-8c7b-4178e7ceb659"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:10.278083 master-0 kubenswrapper[16352]: I0307 21:45:10.276188 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-config-data" (OuterVolumeSpecName: "config-data") pod "acd4977f-05c9-482d-8c7b-4178e7ceb659" (UID: "acd4977f-05c9-482d-8c7b-4178e7ceb659"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:10.296889 master-0 kubenswrapper[16352]: I0307 21:45:10.296816 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:10.296889 master-0 kubenswrapper[16352]: I0307 21:45:10.296866 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/acd4977f-05c9-482d-8c7b-4178e7ceb659-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:10.296889 master-0 kubenswrapper[16352]: I0307 21:45:10.296881 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4ncb2\" (UniqueName: \"kubernetes.io/projected/acd4977f-05c9-482d-8c7b-4178e7ceb659-kube-api-access-4ncb2\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:11.068131 master-0 kubenswrapper[16352]: I0307 21:45:11.068041 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a74b825-c8eb-4977-8524-80215b306e51","Type":"ContainerStarted","Data":"00b69799626111adf3b427fd0a2cdae7841b825cea33f22528a6b93ee3167a19"} Mar 07 21:45:11.068131 master-0 kubenswrapper[16352]: I0307 21:45:11.068132 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"0a74b825-c8eb-4977-8524-80215b306e51","Type":"ContainerStarted","Data":"0bb75e8b3d0ab6b5ee17696f5053d897bb2e9311db9baa4ffa00fcdfddbe23d1"} Mar 07 21:45:11.071355 master-0 kubenswrapper[16352]: I0307 21:45:11.071304 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:45:11.072903 master-0 kubenswrapper[16352]: I0307 21:45:11.072844 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"acd4977f-05c9-482d-8c7b-4178e7ceb659","Type":"ContainerDied","Data":"98448138001ef0184066282442e86a6627a0e452a5fae53594f1d4ebca89ad4c"} Mar 07 21:45:11.073034 master-0 kubenswrapper[16352]: I0307 21:45:11.072928 16352 scope.go:117] "RemoveContainer" containerID="df411381e236438538be6d3bf7aab0f05f479b173b087f13896348c84ec0bd65" Mar 07 21:45:11.138051 master-0 kubenswrapper[16352]: I0307 21:45:11.136666 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.136622167 podStartE2EDuration="2.136622167s" podCreationTimestamp="2026-03-07 21:45:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:45:11.114422963 +0000 UTC m=+1634.185128022" watchObservedRunningTime="2026-03-07 21:45:11.136622167 +0000 UTC m=+1634.207327276" Mar 07 21:45:11.249647 master-0 kubenswrapper[16352]: I0307 21:45:11.249575 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:45:11.262462 master-0 kubenswrapper[16352]: I0307 21:45:11.262425 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:45:11.280158 master-0 kubenswrapper[16352]: I0307 21:45:11.280015 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:45:11.282508 master-0 kubenswrapper[16352]: E0307 21:45:11.282438 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="acd4977f-05c9-482d-8c7b-4178e7ceb659" containerName="nova-scheduler-scheduler" Mar 07 21:45:11.282585 master-0 kubenswrapper[16352]: I0307 21:45:11.282516 16352 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="acd4977f-05c9-482d-8c7b-4178e7ceb659" containerName="nova-scheduler-scheduler" Mar 07 21:45:11.283299 master-0 kubenswrapper[16352]: I0307 21:45:11.283266 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="acd4977f-05c9-482d-8c7b-4178e7ceb659" containerName="nova-scheduler-scheduler" Mar 07 21:45:11.285154 master-0 kubenswrapper[16352]: I0307 21:45:11.285117 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:45:11.289850 master-0 kubenswrapper[16352]: I0307 21:45:11.289794 16352 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 07 21:45:11.295499 master-0 kubenswrapper[16352]: I0307 21:45:11.295432 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:45:11.343390 master-0 kubenswrapper[16352]: I0307 21:45:11.343242 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dd8s\" (UniqueName: \"kubernetes.io/projected/e79dfbce-919e-4a56-9dca-c32cd640d84d-kube-api-access-2dd8s\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.343390 master-0 kubenswrapper[16352]: I0307 21:45:11.343347 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79dfbce-919e-4a56-9dca-c32cd640d84d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.344152 master-0 kubenswrapper[16352]: I0307 21:45:11.344041 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79dfbce-919e-4a56-9dca-c32cd640d84d-config-data\") pod \"nova-scheduler-0\" (UID: 
\"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.447058 master-0 kubenswrapper[16352]: I0307 21:45:11.446986 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2dd8s\" (UniqueName: \"kubernetes.io/projected/e79dfbce-919e-4a56-9dca-c32cd640d84d-kube-api-access-2dd8s\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.447429 master-0 kubenswrapper[16352]: I0307 21:45:11.447399 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79dfbce-919e-4a56-9dca-c32cd640d84d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.447763 master-0 kubenswrapper[16352]: I0307 21:45:11.447739 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79dfbce-919e-4a56-9dca-c32cd640d84d-config-data\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.453738 master-0 kubenswrapper[16352]: I0307 21:45:11.453677 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e79dfbce-919e-4a56-9dca-c32cd640d84d-config-data\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.455029 master-0 kubenswrapper[16352]: I0307 21:45:11.454946 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e79dfbce-919e-4a56-9dca-c32cd640d84d-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.470513 master-0 
kubenswrapper[16352]: I0307 21:45:11.470455 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2dd8s\" (UniqueName: \"kubernetes.io/projected/e79dfbce-919e-4a56-9dca-c32cd640d84d-kube-api-access-2dd8s\") pod \"nova-scheduler-0\" (UID: \"e79dfbce-919e-4a56-9dca-c32cd640d84d\") " pod="openstack/nova-scheduler-0" Mar 07 21:45:11.629129 master-0 kubenswrapper[16352]: I0307 21:45:11.628971 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 07 21:45:12.320034 master-0 kubenswrapper[16352]: W0307 21:45:12.319967 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode79dfbce_919e_4a56_9dca_c32cd640d84d.slice/crio-a62010a6c67d851ea41fa283cabdc40f61ac48973318d5a0ffddca9b0ec9e244 WatchSource:0}: Error finding container a62010a6c67d851ea41fa283cabdc40f61ac48973318d5a0ffddca9b0ec9e244: Status 404 returned error can't find the container with id a62010a6c67d851ea41fa283cabdc40f61ac48973318d5a0ffddca9b0ec9e244 Mar 07 21:45:12.327150 master-0 kubenswrapper[16352]: I0307 21:45:12.327045 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 07 21:45:13.112444 master-0 kubenswrapper[16352]: I0307 21:45:13.112344 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e79dfbce-919e-4a56-9dca-c32cd640d84d","Type":"ContainerStarted","Data":"164045a3f368e00de0a738d8ab087b71d83324148ab2ffa3d28754caa8b7204d"} Mar 07 21:45:13.112444 master-0 kubenswrapper[16352]: I0307 21:45:13.112426 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"e79dfbce-919e-4a56-9dca-c32cd640d84d","Type":"ContainerStarted","Data":"a62010a6c67d851ea41fa283cabdc40f61ac48973318d5a0ffddca9b0ec9e244"} Mar 07 21:45:13.159447 master-0 kubenswrapper[16352]: I0307 21:45:13.159320 16352 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.159291905 podStartE2EDuration="2.159291905s" podCreationTimestamp="2026-03-07 21:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:45:13.147612304 +0000 UTC m=+1636.218317373" watchObservedRunningTime="2026-03-07 21:45:13.159291905 +0000 UTC m=+1636.229996974" Mar 07 21:45:13.209725 master-0 kubenswrapper[16352]: I0307 21:45:13.209569 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acd4977f-05c9-482d-8c7b-4178e7ceb659" path="/var/lib/kubelet/pods/acd4977f-05c9-482d-8c7b-4178e7ceb659/volumes" Mar 07 21:45:14.475749 master-0 kubenswrapper[16352]: I0307 21:45:14.475639 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 21:45:14.476951 master-0 kubenswrapper[16352]: I0307 21:45:14.475816 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 07 21:45:16.443757 master-0 kubenswrapper[16352]: I0307 21:45:16.443607 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 21:45:16.443757 master-0 kubenswrapper[16352]: I0307 21:45:16.443777 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 07 21:45:16.629879 master-0 kubenswrapper[16352]: I0307 21:45:16.629808 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 07 21:45:17.469943 master-0 kubenswrapper[16352]: I0307 21:45:17.469809 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20b6be5e-a0d3-491d-80d9-dfe0a69e19d6" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.128.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded 
while awaiting headers)" Mar 07 21:45:17.471183 master-0 kubenswrapper[16352]: I0307 21:45:17.469918 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="20b6be5e-a0d3-491d-80d9-dfe0a69e19d6" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.128.1.16:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:45:19.476622 master-0 kubenswrapper[16352]: I0307 21:45:19.476548 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 21:45:19.476622 master-0 kubenswrapper[16352]: I0307 21:45:19.476615 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 07 21:45:20.299121 master-0 kubenswrapper[16352]: E0307 21:45:20.298924 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:20.496952 master-0 kubenswrapper[16352]: I0307 21:45:20.496866 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0a74b825-c8eb-4977-8524-80215b306e51" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.128.1.17:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:45:20.497702 master-0 kubenswrapper[16352]: I0307 21:45:20.497285 16352 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="0a74b825-c8eb-4977-8524-80215b306e51" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.128.1.17:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 07 21:45:20.571830 master-0 kubenswrapper[16352]: E0307 21:45:20.568579 16352 
cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:21.630345 master-0 kubenswrapper[16352]: I0307 21:45:21.630233 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 07 21:45:21.688207 master-0 kubenswrapper[16352]: I0307 21:45:21.688137 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 07 21:45:22.328004 master-0 kubenswrapper[16352]: I0307 21:45:22.327898 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 07 21:45:26.455158 master-0 kubenswrapper[16352]: I0307 21:45:26.455085 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 21:45:26.456557 master-0 kubenswrapper[16352]: I0307 21:45:26.456453 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 21:45:26.457006 master-0 kubenswrapper[16352]: I0307 21:45:26.456967 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 07 21:45:26.465648 master-0 kubenswrapper[16352]: I0307 21:45:26.465553 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 21:45:27.427455 master-0 kubenswrapper[16352]: I0307 21:45:27.427385 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 07 21:45:27.524432 master-0 kubenswrapper[16352]: I0307 21:45:27.524357 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 07 21:45:29.483176 master-0 kubenswrapper[16352]: I0307 21:45:29.483082 16352 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 21:45:29.487044 master-0 kubenswrapper[16352]: I0307 21:45:29.486967 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 07 21:45:29.498443 master-0 kubenswrapper[16352]: I0307 21:45:29.498364 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 21:45:30.476279 master-0 kubenswrapper[16352]: I0307 21:45:30.476171 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 07 21:45:30.747593 master-0 kubenswrapper[16352]: E0307 21:45:30.747443 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:35.811375 master-0 kubenswrapper[16352]: E0307 21:45:35.811281 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:37.644082 master-0 kubenswrapper[16352]: E0307 21:45:37.641884 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:37.644082 master-0 kubenswrapper[16352]: E0307 21:45:37.642274 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: 
unable to find data in memory cache]" Mar 07 21:45:41.127297 master-0 kubenswrapper[16352]: E0307 21:45:41.127185 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:50.908963 master-0 kubenswrapper[16352]: E0307 21:45:50.908879 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:51.212189 master-0 kubenswrapper[16352]: E0307 21:45:51.211979 16352 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6a083cbc_d960_42c1_841f_8c0a8b262f87.slice\": RecentStats: unable to find data in memory cache]" Mar 07 21:45:56.733931 master-0 kubenswrapper[16352]: I0307 21:45:56.733859 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xgc79"] Mar 07 21:45:56.734956 master-0 kubenswrapper[16352]: I0307 21:45:56.734900 16352 kuberuntime_container.go:808] "Killing container with a grace period" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" podUID="56c0a57c-e9dd-4f2a-8e20-045a2ca28321" containerName="sushy-emulator" containerID="cri-o://85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926" gracePeriod=30 Mar 07 21:45:57.429186 master-0 kubenswrapper[16352]: I0307 21:45:57.429116 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:45:57.512362 master-0 kubenswrapper[16352]: I0307 21:45:57.512303 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-sushy-emulator-config\") pod \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " Mar 07 21:45:57.512514 master-0 kubenswrapper[16352]: I0307 21:45:57.512426 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-66kln\" (UniqueName: \"kubernetes.io/projected/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-kube-api-access-66kln\") pod \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " Mar 07 21:45:57.512684 master-0 kubenswrapper[16352]: I0307 21:45:57.512650 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-os-client-config\") pod \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\" (UID: \"56c0a57c-e9dd-4f2a-8e20-045a2ca28321\") " Mar 07 21:45:57.513778 master-0 kubenswrapper[16352]: I0307 21:45:57.513652 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-sushy-emulator-config" (OuterVolumeSpecName: "sushy-emulator-config") pod "56c0a57c-e9dd-4f2a-8e20-045a2ca28321" (UID: "56c0a57c-e9dd-4f2a-8e20-045a2ca28321"). InnerVolumeSpecName "sushy-emulator-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 07 21:45:57.536832 master-0 kubenswrapper[16352]: I0307 21:45:57.523815 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-os-client-config" (OuterVolumeSpecName: "os-client-config") pod "56c0a57c-e9dd-4f2a-8e20-045a2ca28321" (UID: "56c0a57c-e9dd-4f2a-8e20-045a2ca28321"). InnerVolumeSpecName "os-client-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 21:45:57.552492 master-0 kubenswrapper[16352]: I0307 21:45:57.552257 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-kube-api-access-66kln" (OuterVolumeSpecName: "kube-api-access-66kln") pod "56c0a57c-e9dd-4f2a-8e20-045a2ca28321" (UID: "56c0a57c-e9dd-4f2a-8e20-045a2ca28321"). InnerVolumeSpecName "kube-api-access-66kln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 21:45:57.601120 master-0 kubenswrapper[16352]: I0307 21:45:57.601037 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-9qs5d"] Mar 07 21:45:57.601924 master-0 kubenswrapper[16352]: E0307 21:45:57.601888 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56c0a57c-e9dd-4f2a-8e20-045a2ca28321" containerName="sushy-emulator" Mar 07 21:45:57.601924 master-0 kubenswrapper[16352]: I0307 21:45:57.601920 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="56c0a57c-e9dd-4f2a-8e20-045a2ca28321" containerName="sushy-emulator" Mar 07 21:45:57.602310 master-0 kubenswrapper[16352]: I0307 21:45:57.602276 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="56c0a57c-e9dd-4f2a-8e20-045a2ca28321" containerName="sushy-emulator" Mar 07 21:45:57.603454 master-0 kubenswrapper[16352]: I0307 21:45:57.603410 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.619819 master-0 kubenswrapper[16352]: I0307 21:45:57.619631 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/ea9e761d-fb17-42fd-8ba4-71d25d655143-os-client-config\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.619956 master-0 kubenswrapper[16352]: I0307 21:45:57.619897 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/ea9e761d-fb17-42fd-8ba4-71d25d655143-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.620083 master-0 kubenswrapper[16352]: I0307 21:45:57.620042 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-65fwz\" (UniqueName: \"kubernetes.io/projected/ea9e761d-fb17-42fd-8ba4-71d25d655143-kube-api-access-65fwz\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.620225 master-0 kubenswrapper[16352]: I0307 21:45:57.620192 16352 reconciler_common.go:293] "Volume detached for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-sushy-emulator-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:57.620225 master-0 kubenswrapper[16352]: I0307 21:45:57.620217 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-66kln\" (UniqueName: \"kubernetes.io/projected/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-kube-api-access-66kln\") on node \"master-0\" 
DevicePath \"\"" Mar 07 21:45:57.620319 master-0 kubenswrapper[16352]: I0307 21:45:57.620230 16352 reconciler_common.go:293] "Volume detached for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/56c0a57c-e9dd-4f2a-8e20-045a2ca28321-os-client-config\") on node \"master-0\" DevicePath \"\"" Mar 07 21:45:57.629144 master-0 kubenswrapper[16352]: I0307 21:45:57.624034 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-9qs5d"] Mar 07 21:45:57.722379 master-0 kubenswrapper[16352]: I0307 21:45:57.722314 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/ea9e761d-fb17-42fd-8ba4-71d25d655143-os-client-config\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.722575 master-0 kubenswrapper[16352]: I0307 21:45:57.722474 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sushy-emulator-config\" (UniqueName: \"kubernetes.io/configmap/ea9e761d-fb17-42fd-8ba4-71d25d655143-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.722653 master-0 kubenswrapper[16352]: I0307 21:45:57.722576 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-65fwz\" (UniqueName: \"kubernetes.io/projected/ea9e761d-fb17-42fd-8ba4-71d25d655143-kube-api-access-65fwz\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.725446 master-0 kubenswrapper[16352]: I0307 21:45:57.725247 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sushy-emulator-config\" (UniqueName: 
\"kubernetes.io/configmap/ea9e761d-fb17-42fd-8ba4-71d25d655143-sushy-emulator-config\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.730714 master-0 kubenswrapper[16352]: I0307 21:45:57.730608 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-client-config\" (UniqueName: \"kubernetes.io/secret/ea9e761d-fb17-42fd-8ba4-71d25d655143-os-client-config\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.759969 master-0 kubenswrapper[16352]: I0307 21:45:57.759852 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-65fwz\" (UniqueName: \"kubernetes.io/projected/ea9e761d-fb17-42fd-8ba4-71d25d655143-kube-api-access-65fwz\") pod \"sushy-emulator-84965d5d88-9qs5d\" (UID: \"ea9e761d-fb17-42fd-8ba4-71d25d655143\") " pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.950633 master-0 kubenswrapper[16352]: I0307 21:45:57.950519 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:45:57.957258 master-0 kubenswrapper[16352]: I0307 21:45:57.957193 16352 generic.go:334] "Generic (PLEG): container finished" podID="56c0a57c-e9dd-4f2a-8e20-045a2ca28321" containerID="85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926" exitCode=0 Mar 07 21:45:57.957368 master-0 kubenswrapper[16352]: I0307 21:45:57.957265 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" event={"ID":"56c0a57c-e9dd-4f2a-8e20-045a2ca28321","Type":"ContainerDied","Data":"85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926"} Mar 07 21:45:57.957368 master-0 kubenswrapper[16352]: I0307 21:45:57.957308 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" event={"ID":"56c0a57c-e9dd-4f2a-8e20-045a2ca28321","Type":"ContainerDied","Data":"afcb17a3a3f3d74ea9ce256878a7ddc858abc2f1ff611c99ad43b4f92ca74f13"} Mar 07 21:45:57.957368 master-0 kubenswrapper[16352]: I0307 21:45:57.957340 16352 scope.go:117] "RemoveContainer" containerID="85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926" Mar 07 21:45:57.957501 master-0 kubenswrapper[16352]: I0307 21:45:57.957271 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="sushy-emulator/sushy-emulator-78f6d7d749-xgc79" Mar 07 21:45:58.005970 master-0 kubenswrapper[16352]: I0307 21:45:58.005861 16352 scope.go:117] "RemoveContainer" containerID="85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926" Mar 07 21:45:58.006858 master-0 kubenswrapper[16352]: E0307 21:45:58.006771 16352 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926\": container with ID starting with 85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926 not found: ID does not exist" containerID="85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926" Mar 07 21:45:58.006939 master-0 kubenswrapper[16352]: I0307 21:45:58.006864 16352 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926"} err="failed to get container status \"85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926\": rpc error: code = NotFound desc = could not find container \"85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926\": container with ID starting with 85f003a1470db6056f5841941182dfb4dc2adeb660b7450d817f6e9ba8599926 not found: ID does not exist" Mar 07 21:45:58.019984 master-0 kubenswrapper[16352]: I0307 21:45:58.019871 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xgc79"] Mar 07 21:45:58.038590 master-0 kubenswrapper[16352]: I0307 21:45:58.038454 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["sushy-emulator/sushy-emulator-78f6d7d749-xgc79"] Mar 07 21:45:58.609677 master-0 kubenswrapper[16352]: I0307 21:45:58.609570 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["sushy-emulator/sushy-emulator-84965d5d88-9qs5d"] Mar 07 21:45:58.976315 master-0 kubenswrapper[16352]: I0307 21:45:58.976244 16352 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" event={"ID":"ea9e761d-fb17-42fd-8ba4-71d25d655143","Type":"ContainerStarted","Data":"45aea26b20a004b27d56bb8ccaea96f2d22cd7da8c97c2600d08efe80281a8b2"} Mar 07 21:45:58.976315 master-0 kubenswrapper[16352]: I0307 21:45:58.976309 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" event={"ID":"ea9e761d-fb17-42fd-8ba4-71d25d655143","Type":"ContainerStarted","Data":"6a9bbff5ac9725de4a3202420dd1f76b9cdd49ca5debf348578db5871033256d"} Mar 07 21:45:59.016187 master-0 kubenswrapper[16352]: I0307 21:45:59.016069 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" podStartSLOduration=2.016039176 podStartE2EDuration="2.016039176s" podCreationTimestamp="2026-03-07 21:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 21:45:59.001420616 +0000 UTC m=+1682.072125675" watchObservedRunningTime="2026-03-07 21:45:59.016039176 +0000 UTC m=+1682.086744245" Mar 07 21:45:59.209532 master-0 kubenswrapper[16352]: I0307 21:45:59.209424 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56c0a57c-e9dd-4f2a-8e20-045a2ca28321" path="/var/lib/kubelet/pods/56c0a57c-e9dd-4f2a-8e20-045a2ca28321/volumes" Mar 07 21:46:07.952161 master-0 kubenswrapper[16352]: I0307 21:46:07.952085 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:46:07.952994 master-0 kubenswrapper[16352]: I0307 21:46:07.952186 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:46:07.963623 master-0 kubenswrapper[16352]: I0307 21:46:07.963550 16352 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:46:08.125152 master-0 kubenswrapper[16352]: I0307 21:46:08.125021 16352 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="sushy-emulator/sushy-emulator-84965d5d88-9qs5d" Mar 07 21:47:14.118156 master-0 kubenswrapper[16352]: I0307 21:47:14.118042 16352 scope.go:117] "RemoveContainer" containerID="a23d62c841cede038c0d7eaae41e12ad84dcebabbcd19790cc9a3deb18c0b09d" Mar 07 21:47:19.748672 master-0 kubenswrapper[16352]: E0307 21:47:19.748599 16352 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 192.168.32.10:49984->192.168.32.10:42573: write tcp 192.168.32.10:49984->192.168.32.10:42573: write: broken pipe Mar 07 21:48:14.210634 master-0 kubenswrapper[16352]: I0307 21:48:14.210528 16352 scope.go:117] "RemoveContainer" containerID="492bceae39af68bf199877a517d49ca13c19ec0d954cb2d9bd2b30513d6aaddf" Mar 07 21:48:14.249220 master-0 kubenswrapper[16352]: I0307 21:48:14.249134 16352 scope.go:117] "RemoveContainer" containerID="fc994383f7876be1e38177029dc4289405bb594f54e4d51c6a4f561d4a6fb893" Mar 07 21:49:14.368822 master-0 kubenswrapper[16352]: I0307 21:49:14.368720 16352 scope.go:117] "RemoveContainer" containerID="0c3281a5a62aa7167b934e02e0ce4b52dd36eac632736db435e2f8e4a252d234" Mar 07 21:49:14.404178 master-0 kubenswrapper[16352]: I0307 21:49:14.404102 16352 scope.go:117] "RemoveContainer" containerID="be583d403489b6e6d7035f5a522e0cb98d638b8ae91cbbd533cf866161a60b68" Mar 07 21:51:01.089466 master-0 kubenswrapper[16352]: I0307 21:51:01.089341 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-8jd9b"] Mar 07 21:51:01.108828 master-0 kubenswrapper[16352]: I0307 21:51:01.108678 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-shqgh"] Mar 07 21:51:01.124791 master-0 kubenswrapper[16352]: I0307 21:51:01.124717 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/glance-db-create-8jd9b"] Mar 07 21:51:01.136748 master-0 kubenswrapper[16352]: I0307 21:51:01.136650 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-shqgh"] Mar 07 21:51:01.208030 master-0 kubenswrapper[16352]: I0307 21:51:01.207930 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16841f85-d2fc-46a7-8c09-c957cb2cfb4f" path="/var/lib/kubelet/pods/16841f85-d2fc-46a7-8c09-c957cb2cfb4f/volumes" Mar 07 21:51:01.208906 master-0 kubenswrapper[16352]: I0307 21:51:01.208859 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5585c07c-5a03-46df-a14b-0cd2550e7ca1" path="/var/lib/kubelet/pods/5585c07c-5a03-46df-a14b-0cd2550e7ca1/volumes" Mar 07 21:51:02.089098 master-0 kubenswrapper[16352]: I0307 21:51:02.089021 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-q8tkd"] Mar 07 21:51:02.110577 master-0 kubenswrapper[16352]: I0307 21:51:02.110476 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-c490-account-create-update-rc6gq"] Mar 07 21:51:02.122014 master-0 kubenswrapper[16352]: I0307 21:51:02.121954 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-326e-account-create-update-g8dbq"] Mar 07 21:51:02.133517 master-0 kubenswrapper[16352]: I0307 21:51:02.133429 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-3631-account-create-update-8m8jf"] Mar 07 21:51:02.144254 master-0 kubenswrapper[16352]: I0307 21:51:02.144216 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-q8tkd"] Mar 07 21:51:02.154518 master-0 kubenswrapper[16352]: I0307 21:51:02.154464 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-c490-account-create-update-rc6gq"] Mar 07 21:51:02.165362 master-0 kubenswrapper[16352]: I0307 21:51:02.165300 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/placement-326e-account-create-update-g8dbq"] Mar 07 21:51:02.176030 master-0 kubenswrapper[16352]: I0307 21:51:02.175796 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-3631-account-create-update-8m8jf"] Mar 07 21:51:03.210507 master-0 kubenswrapper[16352]: I0307 21:51:03.210411 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="732d7db0-01b4-4187-92d9-01e6a04f91f8" path="/var/lib/kubelet/pods/732d7db0-01b4-4187-92d9-01e6a04f91f8/volumes" Mar 07 21:51:03.211646 master-0 kubenswrapper[16352]: I0307 21:51:03.211522 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b1e7528-a696-43e8-b487-d981fb460467" path="/var/lib/kubelet/pods/7b1e7528-a696-43e8-b487-d981fb460467/volumes" Mar 07 21:51:03.212506 master-0 kubenswrapper[16352]: I0307 21:51:03.212444 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="842cfecc-7700-4a61-a685-c211db763dcb" path="/var/lib/kubelet/pods/842cfecc-7700-4a61-a685-c211db763dcb/volumes" Mar 07 21:51:03.213398 master-0 kubenswrapper[16352]: I0307 21:51:03.213349 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc6fea59-4e62-414e-a8c0-d6c6a60fb72c" path="/var/lib/kubelet/pods/dc6fea59-4e62-414e-a8c0-d6c6a60fb72c/volumes" Mar 07 21:51:06.060871 master-0 kubenswrapper[16352]: I0307 21:51:06.060740 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-m995r"] Mar 07 21:51:06.079911 master-0 kubenswrapper[16352]: I0307 21:51:06.079784 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-m995r"] Mar 07 21:51:07.234426 master-0 kubenswrapper[16352]: I0307 21:51:07.234223 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ae64be7-df95-407d-bcd8-732b98e9df90" path="/var/lib/kubelet/pods/2ae64be7-df95-407d-bcd8-732b98e9df90/volumes" Mar 07 21:51:14.527337 master-0 kubenswrapper[16352]: I0307 
21:51:14.527247 16352 scope.go:117] "RemoveContainer" containerID="01516de94ecdfbaf37dc0c71f462dcf88bea038636c0475dffbc097dacbadf8c" Mar 07 21:51:14.564585 master-0 kubenswrapper[16352]: I0307 21:51:14.564346 16352 scope.go:117] "RemoveContainer" containerID="e56be3b2b266c11a944e77332dce1c1cf807a1b6fe3a182e779e02a1245f5cd2" Mar 07 21:51:14.594916 master-0 kubenswrapper[16352]: I0307 21:51:14.594812 16352 scope.go:117] "RemoveContainer" containerID="a1e576e932db5e2f3c69c6aad98cf8f3e15fcea20e6c94144b9af8de58570c1c" Mar 07 21:51:14.632911 master-0 kubenswrapper[16352]: I0307 21:51:14.632834 16352 scope.go:117] "RemoveContainer" containerID="d05854d3fa0625cdd2ed64a244c92ef91acfa81fc8d83d589e8979a8f3bbb33b" Mar 07 21:51:14.685408 master-0 kubenswrapper[16352]: I0307 21:51:14.685349 16352 scope.go:117] "RemoveContainer" containerID="e6ec93a78a1f7a3aa6d6df7415329166122121d250cbbce9fea6fadde9efeca3" Mar 07 21:51:14.723255 master-0 kubenswrapper[16352]: I0307 21:51:14.723211 16352 scope.go:117] "RemoveContainer" containerID="6be1d1f530d0c7bdbc0d1c8090301072dfb8782fe72d092e1f23a3f4350c5f5c" Mar 07 21:51:14.762378 master-0 kubenswrapper[16352]: I0307 21:51:14.762334 16352 scope.go:117] "RemoveContainer" containerID="e1e87fade9c512cebf40ef76d354b9544a5dd6710efebf3df25c1d874089ecab" Mar 07 21:51:14.796647 master-0 kubenswrapper[16352]: I0307 21:51:14.796577 16352 scope.go:117] "RemoveContainer" containerID="3046e1b94ffdc03a870af1c7395289a410b6ce16ca34eeedf5880171425e18dd" Mar 07 21:51:14.836632 master-0 kubenswrapper[16352]: I0307 21:51:14.836581 16352 scope.go:117] "RemoveContainer" containerID="a6e2a25b80eca6e549dfa1628abd2c09396150183fe670303541bdccbe0f9072" Mar 07 21:51:35.101470 master-0 kubenswrapper[16352]: I0307 21:51:35.101367 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-5hn4x"] Mar 07 21:51:35.122093 master-0 kubenswrapper[16352]: I0307 21:51:35.121989 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/neutron-b4a7-account-create-update-xgrjd"] Mar 07 21:51:35.150943 master-0 kubenswrapper[16352]: I0307 21:51:35.150838 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-b4a7-account-create-update-xgrjd"] Mar 07 21:51:35.169791 master-0 kubenswrapper[16352]: I0307 21:51:35.168772 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-5hn4x"] Mar 07 21:51:35.217303 master-0 kubenswrapper[16352]: I0307 21:51:35.216411 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7aed4fb8-bf15-4337-8567-30ad6ff11ce3" path="/var/lib/kubelet/pods/7aed4fb8-bf15-4337-8567-30ad6ff11ce3/volumes" Mar 07 21:51:35.217303 master-0 kubenswrapper[16352]: I0307 21:51:35.217040 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e3a1e884-7f03-4892-ab50-fa35f704f9a1" path="/var/lib/kubelet/pods/e3a1e884-7f03-4892-ab50-fa35f704f9a1/volumes" Mar 07 21:51:36.079978 master-0 kubenswrapper[16352]: I0307 21:51:36.079885 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-s9668"] Mar 07 21:51:36.092561 master-0 kubenswrapper[16352]: I0307 21:51:36.092473 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-dkp4f"] Mar 07 21:51:36.105732 master-0 kubenswrapper[16352]: I0307 21:51:36.105661 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-66da-account-create-update-cczxb"] Mar 07 21:51:36.121616 master-0 kubenswrapper[16352]: I0307 21:51:36.119872 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-s9668"] Mar 07 21:51:36.131036 master-0 kubenswrapper[16352]: I0307 21:51:36.130974 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-dkp4f"] Mar 07 21:51:36.143254 master-0 kubenswrapper[16352]: I0307 21:51:36.142481 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack/cinder-66da-account-create-update-cczxb"] Mar 07 21:51:37.219859 master-0 kubenswrapper[16352]: I0307 21:51:37.219596 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a8458eb-8fd1-4415-871a-4a9b45f21de9" path="/var/lib/kubelet/pods/1a8458eb-8fd1-4415-871a-4a9b45f21de9/volumes" Mar 07 21:51:37.221196 master-0 kubenswrapper[16352]: I0307 21:51:37.221129 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3fd3e133-30c5-4efb-9522-ee8423491e95" path="/var/lib/kubelet/pods/3fd3e133-30c5-4efb-9522-ee8423491e95/volumes" Mar 07 21:51:37.230459 master-0 kubenswrapper[16352]: I0307 21:51:37.230386 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c53b3432-7649-440a-b109-bb48be9f10c7" path="/var/lib/kubelet/pods/c53b3432-7649-440a-b109-bb48be9f10c7/volumes" Mar 07 21:51:41.064747 master-0 kubenswrapper[16352]: I0307 21:51:41.062342 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-cmr5t"] Mar 07 21:51:41.102809 master-0 kubenswrapper[16352]: I0307 21:51:41.102244 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-cmr5t"] Mar 07 21:51:41.220302 master-0 kubenswrapper[16352]: I0307 21:51:41.220188 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68d35360-cb8c-439f-8efe-f84fa02416e8" path="/var/lib/kubelet/pods/68d35360-cb8c-439f-8efe-f84fa02416e8/volumes" Mar 07 21:51:48.044337 master-0 kubenswrapper[16352]: I0307 21:51:48.044102 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-create-j9hg2"] Mar 07 21:51:48.059874 master-0 kubenswrapper[16352]: I0307 21:51:48.059769 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-create-j9hg2"] Mar 07 21:51:49.215431 master-0 kubenswrapper[16352]: I0307 21:51:49.215140 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="673480b0-be7b-453c-b7b3-8646042b3e59" 
path="/var/lib/kubelet/pods/673480b0-be7b-453c-b7b3-8646042b3e59/volumes" Mar 07 21:51:51.070265 master-0 kubenswrapper[16352]: I0307 21:51:51.070145 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-20ba-account-create-update-4dtlr"] Mar 07 21:51:51.087582 master-0 kubenswrapper[16352]: I0307 21:51:51.087489 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-20ba-account-create-update-4dtlr"] Mar 07 21:51:51.246510 master-0 kubenswrapper[16352]: I0307 21:51:51.246414 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d3d63cde-c147-4e68-8491-753368687501" path="/var/lib/kubelet/pods/d3d63cde-c147-4e68-8491-753368687501/volumes" Mar 07 21:52:05.091808 master-0 kubenswrapper[16352]: I0307 21:52:05.091653 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-4wkkv"] Mar 07 21:52:05.107777 master-0 kubenswrapper[16352]: I0307 21:52:05.107659 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-4wkkv"] Mar 07 21:52:05.207312 master-0 kubenswrapper[16352]: I0307 21:52:05.207187 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0c040c8-ff93-4fd1-8f24-41bf2e0a8985" path="/var/lib/kubelet/pods/b0c040c8-ff93-4fd1-8f24-41bf2e0a8985/volumes" Mar 07 21:52:15.068705 master-0 kubenswrapper[16352]: I0307 21:52:15.067041 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-zh2n5"] Mar 07 21:52:15.092713 master-0 kubenswrapper[16352]: I0307 21:52:15.090897 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-zh2n5"] Mar 07 21:52:15.129656 master-0 kubenswrapper[16352]: I0307 21:52:15.129568 16352 scope.go:117] "RemoveContainer" containerID="ddfa45451932b9f4486cf12c7cb8a7826c3eba8173a69005fcee8d6abe5fcaf8" Mar 07 21:52:15.151860 master-0 kubenswrapper[16352]: I0307 21:52:15.151790 16352 scope.go:117] "RemoveContainer" 
containerID="26c2967c8a278140c4cf0ec5f5f1a8d674f8cc49abe5d0529afedb2063bf41c2" Mar 07 21:52:15.183425 master-0 kubenswrapper[16352]: I0307 21:52:15.183345 16352 scope.go:117] "RemoveContainer" containerID="ce0343eb6e31502152abf1d3207c481b00edb500c99ae5b13af3b1b83142fd0f" Mar 07 21:52:15.207284 master-0 kubenswrapper[16352]: I0307 21:52:15.207147 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bcad141-6b4f-4b5c-a3e0-236d54fe850c" path="/var/lib/kubelet/pods/2bcad141-6b4f-4b5c-a3e0-236d54fe850c/volumes" Mar 07 21:52:15.215998 master-0 kubenswrapper[16352]: I0307 21:52:15.215948 16352 scope.go:117] "RemoveContainer" containerID="0909e7a52b3666dad29a2718ce2344084b76934cbf54715aff39eb6e54b4bde1" Mar 07 21:52:15.243387 master-0 kubenswrapper[16352]: I0307 21:52:15.243302 16352 scope.go:117] "RemoveContainer" containerID="2290b63b0ddc2e5a9d606527b15e706a487fb1316d1c8f0fb7ec06686188f238" Mar 07 21:52:15.269742 master-0 kubenswrapper[16352]: I0307 21:52:15.269627 16352 scope.go:117] "RemoveContainer" containerID="d91ddbb60f350d9a8a522f80755adc3279c6c3272b25bb812f758dde17d24b22" Mar 07 21:52:15.292659 master-0 kubenswrapper[16352]: I0307 21:52:15.292589 16352 scope.go:117] "RemoveContainer" containerID="c84f49f70e57b95ce5538ee514920d8345a901e6861c316e5479d3149141e5f4" Mar 07 21:52:15.317594 master-0 kubenswrapper[16352]: I0307 21:52:15.317243 16352 scope.go:117] "RemoveContainer" containerID="3ff1bfa4c48957b3ebdbb387bb896638f5e87c9f70c2d14b5c74643ae2764dc6" Mar 07 21:52:15.344480 master-0 kubenswrapper[16352]: I0307 21:52:15.344082 16352 scope.go:117] "RemoveContainer" containerID="1492792eff1f8c3692e3c3184a969fc94ca5abbc701364a6b2c15146a925042a" Mar 07 21:52:15.373413 master-0 kubenswrapper[16352]: I0307 21:52:15.373361 16352 scope.go:117] "RemoveContainer" containerID="2268a2fafaeba584ed472288cc56505476177236c2d68526b6349a3f1741d5b3" Mar 07 21:52:20.057308 master-0 kubenswrapper[16352]: I0307 21:52:20.057197 16352 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openstack/cinder-86971-db-sync-m7xht"] Mar 07 21:52:20.072885 master-0 kubenswrapper[16352]: I0307 21:52:20.072782 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-86971-db-sync-m7xht"] Mar 07 21:52:21.208327 master-0 kubenswrapper[16352]: I0307 21:52:21.208230 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c17f59fb-df31-45d5-9077-ac10aa310af2" path="/var/lib/kubelet/pods/c17f59fb-df31-45d5-9077-ac10aa310af2/volumes" Mar 07 21:52:23.071026 master-0 kubenswrapper[16352]: I0307 21:52:23.070902 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-97jz8"] Mar 07 21:52:23.089745 master-0 kubenswrapper[16352]: I0307 21:52:23.089598 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-97jz8"] Mar 07 21:52:23.212953 master-0 kubenswrapper[16352]: I0307 21:52:23.212877 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="218021bb-e4db-42b1-a553-f2a373cd9565" path="/var/lib/kubelet/pods/218021bb-e4db-42b1-a553-f2a373cd9565/volumes" Mar 07 21:52:32.067408 master-0 kubenswrapper[16352]: I0307 21:52:32.065943 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-db-sync-mtvqh"] Mar 07 21:52:32.087190 master-0 kubenswrapper[16352]: I0307 21:52:32.087019 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-db-sync-mtvqh"] Mar 07 21:52:33.224267 master-0 kubenswrapper[16352]: I0307 21:52:33.224133 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2a6736c6-a65f-4821-91a1-747418c62459" path="/var/lib/kubelet/pods/2a6736c6-a65f-4821-91a1-747418c62459/volumes" Mar 07 21:52:39.052509 master-0 kubenswrapper[16352]: I0307 21:52:39.052421 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-create-sdzv8"] Mar 07 21:52:39.081279 master-0 kubenswrapper[16352]: I0307 21:52:39.081128 16352 kubelet.go:2431] 
"SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-create-sdzv8"] Mar 07 21:52:39.208074 master-0 kubenswrapper[16352]: I0307 21:52:39.207979 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2528cb-3b05-45ee-adf7-67e32faaab12" path="/var/lib/kubelet/pods/ac2528cb-3b05-45ee-adf7-67e32faaab12/volumes" Mar 07 21:52:41.065389 master-0 kubenswrapper[16352]: I0307 21:52:41.065301 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-d904-account-create-update-tc485"] Mar 07 21:52:41.088648 master-0 kubenswrapper[16352]: I0307 21:52:41.088551 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-d904-account-create-update-tc485"] Mar 07 21:52:41.221192 master-0 kubenswrapper[16352]: I0307 21:52:41.221046 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77f0edd2-211c-423c-b49f-d2c69df20f23" path="/var/lib/kubelet/pods/77f0edd2-211c-423c-b49f-d2c69df20f23/volumes" Mar 07 21:52:57.077398 master-0 kubenswrapper[16352]: I0307 21:52:57.076836 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ironic-inspector-db-sync-hst88"] Mar 07 21:52:57.096834 master-0 kubenswrapper[16352]: I0307 21:52:57.096730 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ironic-inspector-db-sync-hst88"] Mar 07 21:52:57.212809 master-0 kubenswrapper[16352]: I0307 21:52:57.212641 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4c65c147-410b-4022-9104-20fb7c362674" path="/var/lib/kubelet/pods/4c65c147-410b-4022-9104-20fb7c362674/volumes" Mar 07 21:53:15.712952 master-0 kubenswrapper[16352]: I0307 21:53:15.712848 16352 scope.go:117] "RemoveContainer" containerID="180a5b72cc6e6eaf2380e30c0c49902fc2604b0b10fbf55bb493c9857cb4c5c9" Mar 07 21:53:15.745469 master-0 kubenswrapper[16352]: I0307 21:53:15.745334 16352 scope.go:117] "RemoveContainer" 
containerID="8e7336a0eb6ab818f0c8d258f43028b48d1c7900ff93861475b25e2a1a77ecd8" Mar 07 21:53:15.782185 master-0 kubenswrapper[16352]: I0307 21:53:15.782113 16352 scope.go:117] "RemoveContainer" containerID="1544aaf023ac14ad99649dc624366028793977da9454b79d8e5eba2a3a57847e" Mar 07 21:53:15.824804 master-0 kubenswrapper[16352]: I0307 21:53:15.824540 16352 scope.go:117] "RemoveContainer" containerID="97a4dba59598ff3cd37a1a636d9fd3ae198b019c02d082f9d48e18bddc3419bf" Mar 07 21:53:15.874804 master-0 kubenswrapper[16352]: I0307 21:53:15.874724 16352 scope.go:117] "RemoveContainer" containerID="7bd06b9455c7b486007744004863bc8005931459c8f25e88ae3f5497ac588905" Mar 07 21:53:15.909910 master-0 kubenswrapper[16352]: I0307 21:53:15.909833 16352 scope.go:117] "RemoveContainer" containerID="12f5f40a8fa04773bbd3cbec144234660f8987f2ad2157088c2d0af0fb8c14f7" Mar 07 21:53:15.945493 master-0 kubenswrapper[16352]: I0307 21:53:15.945360 16352 scope.go:117] "RemoveContainer" containerID="6cdac522b3b5a0c31b3bde57331a76e735d384d73d686550f0182c05c512a505" Mar 07 21:53:19.089261 master-0 kubenswrapper[16352]: I0307 21:53:19.089181 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-0300-account-create-update-b66m5"] Mar 07 21:53:19.108746 master-0 kubenswrapper[16352]: I0307 21:53:19.108643 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-94ssk"] Mar 07 21:53:19.122915 master-0 kubenswrapper[16352]: I0307 21:53:19.122843 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-2b75-account-create-update-gqckp"] Mar 07 21:53:19.141883 master-0 kubenswrapper[16352]: I0307 21:53:19.141808 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-0300-account-create-update-b66m5"] Mar 07 21:53:19.157881 master-0 kubenswrapper[16352]: I0307 21:53:19.157809 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-64285"] Mar 07 21:53:19.173449 
master-0 kubenswrapper[16352]: I0307 21:53:19.173396 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-94ssk"] Mar 07 21:53:19.186754 master-0 kubenswrapper[16352]: I0307 21:53:19.186698 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-db-create-26xt5"] Mar 07 21:53:19.209628 master-0 kubenswrapper[16352]: I0307 21:53:19.209532 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38ea555e-6a67-483f-96a2-9587104b0c38" path="/var/lib/kubelet/pods/38ea555e-6a67-483f-96a2-9587104b0c38/volumes" Mar 07 21:53:19.211121 master-0 kubenswrapper[16352]: I0307 21:53:19.211080 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6bfd3214-eccf-402a-93de-6bd7f2cb2c08" path="/var/lib/kubelet/pods/6bfd3214-eccf-402a-93de-6bd7f2cb2c08/volumes" Mar 07 21:53:19.213404 master-0 kubenswrapper[16352]: I0307 21:53:19.213357 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-64285"] Mar 07 21:53:19.216953 master-0 kubenswrapper[16352]: I0307 21:53:19.216797 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-2b75-account-create-update-gqckp"] Mar 07 21:53:19.227485 master-0 kubenswrapper[16352]: I0307 21:53:19.227415 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-26xt5"] Mar 07 21:53:20.052880 master-0 kubenswrapper[16352]: I0307 21:53:20.052767 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-8a73-account-create-update-s57x2"] Mar 07 21:53:20.072597 master-0 kubenswrapper[16352]: I0307 21:53:20.072505 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-8a73-account-create-update-s57x2"] Mar 07 21:53:21.209514 master-0 kubenswrapper[16352]: I0307 21:53:21.209427 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb" 
path="/var/lib/kubelet/pods/0b9e29ee-7728-4a1a-ad2a-9ba175a5dfeb/volumes" Mar 07 21:53:21.212238 master-0 kubenswrapper[16352]: I0307 21:53:21.212173 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2130caf7-6f24-4cb7-a216-e60f2b951f4a" path="/var/lib/kubelet/pods/2130caf7-6f24-4cb7-a216-e60f2b951f4a/volumes" Mar 07 21:53:21.213940 master-0 kubenswrapper[16352]: I0307 21:53:21.213903 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b487d67-175f-402c-883f-a4001fd9160c" path="/var/lib/kubelet/pods/2b487d67-175f-402c-883f-a4001fd9160c/volumes" Mar 07 21:53:21.215279 master-0 kubenswrapper[16352]: I0307 21:53:21.215246 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5793afc9-06c2-497b-ab66-92254e79e871" path="/var/lib/kubelet/pods/5793afc9-06c2-497b-ab66-92254e79e871/volumes" Mar 07 21:53:55.101616 master-0 kubenswrapper[16352]: I0307 21:53:55.100975 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xm4p"] Mar 07 21:53:55.125798 master-0 kubenswrapper[16352]: I0307 21:53:55.125663 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-9xm4p"] Mar 07 21:53:55.214517 master-0 kubenswrapper[16352]: I0307 21:53:55.214409 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dec069dd-1a94-4b25-95f1-1346f25cf204" path="/var/lib/kubelet/pods/dec069dd-1a94-4b25-95f1-1346f25cf204/volumes" Mar 07 21:54:16.160290 master-0 kubenswrapper[16352]: I0307 21:54:16.160184 16352 scope.go:117] "RemoveContainer" containerID="c549b034ce2d76254f58082bb77f95e290dc619e8499900094d1366d9585e905" Mar 07 21:54:16.196573 master-0 kubenswrapper[16352]: I0307 21:54:16.196499 16352 scope.go:117] "RemoveContainer" containerID="0f7ccd9b7255d1f82a4e994599fcbae5a990d628bf555148310806c85f3742e8" Mar 07 21:54:16.236381 master-0 kubenswrapper[16352]: I0307 21:54:16.236284 16352 scope.go:117] "RemoveContainer" 
containerID="6a89908eff7c42fec7d5f6d4895174aaf23c7087916ae432ee41bc16adbf7c84" Mar 07 21:54:16.271562 master-0 kubenswrapper[16352]: I0307 21:54:16.271505 16352 scope.go:117] "RemoveContainer" containerID="a57de8bad7fe7a4bc9a511bfc6a99d1c0b27b61a43c65d2d32cc3ef5921f3cb5" Mar 07 21:54:16.307170 master-0 kubenswrapper[16352]: I0307 21:54:16.307092 16352 scope.go:117] "RemoveContainer" containerID="15ecc592ea1265251f1a62e583ec349c3f5ed1c236d8ea5e0f294bff44d45ff6" Mar 07 21:54:16.350601 master-0 kubenswrapper[16352]: I0307 21:54:16.350496 16352 scope.go:117] "RemoveContainer" containerID="5996fd8a2d2dec6a64eb982884f39b5b612393a00023e1419365ca6955728216" Mar 07 21:54:16.381495 master-0 kubenswrapper[16352]: I0307 21:54:16.381419 16352 scope.go:117] "RemoveContainer" containerID="e632e2d9f4f74bb8b395559e6a9867df4987eed2140f492aa24bc6165f5f4701" Mar 07 21:54:23.079308 master-0 kubenswrapper[16352]: I0307 21:54:23.079164 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-c5sqt"] Mar 07 21:54:23.092125 master-0 kubenswrapper[16352]: I0307 21:54:23.092020 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-c5sqt"] Mar 07 21:54:23.205409 master-0 kubenswrapper[16352]: I0307 21:54:23.205338 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd475e91-463a-4538-84af-ba2b678d7f06" path="/var/lib/kubelet/pods/bd475e91-463a-4538-84af-ba2b678d7f06/volumes" Mar 07 21:54:25.054442 master-0 kubenswrapper[16352]: I0307 21:54:25.054319 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2rz24"] Mar 07 21:54:25.068515 master-0 kubenswrapper[16352]: I0307 21:54:25.068431 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-2rz24"] Mar 07 21:54:25.214570 master-0 kubenswrapper[16352]: I0307 21:54:25.214464 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="a726ded1-f768-48e1-87c1-1b99262d45e1" path="/var/lib/kubelet/pods/a726ded1-f768-48e1-87c1-1b99262d45e1/volumes" Mar 07 21:55:01.085563 master-0 kubenswrapper[16352]: I0307 21:55:01.085456 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-host-discover-8g65x"] Mar 07 21:55:01.105609 master-0 kubenswrapper[16352]: I0307 21:55:01.105512 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-host-discover-8g65x"] Mar 07 21:55:01.215821 master-0 kubenswrapper[16352]: I0307 21:55:01.215681 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20" path="/var/lib/kubelet/pods/2cdcd2d2-c8c0-40e0-8e36-8f9fe1af1b20/volumes" Mar 07 21:55:03.051840 master-0 kubenswrapper[16352]: I0307 21:55:03.051730 16352 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-8cwkr"] Mar 07 21:55:03.065720 master-0 kubenswrapper[16352]: I0307 21:55:03.065606 16352 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-8cwkr"] Mar 07 21:55:03.208918 master-0 kubenswrapper[16352]: I0307 21:55:03.208721 16352 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2efb703b-1fcb-4aef-b969-8b4afd7dc207" path="/var/lib/kubelet/pods/2efb703b-1fcb-4aef-b969-8b4afd7dc207/volumes" Mar 07 21:55:16.584005 master-0 kubenswrapper[16352]: I0307 21:55:16.583928 16352 scope.go:117] "RemoveContainer" containerID="0fcbcd3425ac07f254302b3f7f585f9e11789071a133657f62c3151499244b38" Mar 07 21:55:16.621946 master-0 kubenswrapper[16352]: I0307 21:55:16.621877 16352 scope.go:117] "RemoveContainer" containerID="41fa12277d21a6dfe6be14c59d8ba3f97290786dd8fbba3281f16d944c8c69da" Mar 07 21:55:16.675141 master-0 kubenswrapper[16352]: I0307 21:55:16.675042 16352 scope.go:117] "RemoveContainer" containerID="43d9dc91614e24b8ae37ce9d5e437173479c23869edf0df37c3e4904686c5776" Mar 07 21:55:16.712610 master-0 kubenswrapper[16352]: 
I0307 21:55:16.712563 16352 scope.go:117] "RemoveContainer" containerID="bc13e8beb36012f16bde41d8bcda1e30055c34fd50a42f60875a5404f6660ecf" Mar 07 22:01:00.180937 master-0 kubenswrapper[16352]: I0307 22:01:00.180781 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29548681-5hg8p"] Mar 07 22:01:00.183226 master-0 kubenswrapper[16352]: I0307 22:01:00.183174 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.253889 master-0 kubenswrapper[16352]: I0307 22:01:00.252909 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29548681-5hg8p"] Mar 07 22:01:00.270715 master-0 kubenswrapper[16352]: I0307 22:01:00.270593 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-fernet-keys\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.271557 master-0 kubenswrapper[16352]: I0307 22:01:00.271512 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-combined-ca-bundle\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.271826 master-0 kubenswrapper[16352]: I0307 22:01:00.271789 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4j9v\" (UniqueName: \"kubernetes.io/projected/f78feced-985c-4119-8982-caf82ad84f0a-kube-api-access-c4j9v\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.273450 master-0 
kubenswrapper[16352]: I0307 22:01:00.272481 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-config-data\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.376368 master-0 kubenswrapper[16352]: I0307 22:01:00.376249 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-fernet-keys\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.376816 master-0 kubenswrapper[16352]: I0307 22:01:00.376432 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-combined-ca-bundle\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.376816 master-0 kubenswrapper[16352]: I0307 22:01:00.376474 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4j9v\" (UniqueName: \"kubernetes.io/projected/f78feced-985c-4119-8982-caf82ad84f0a-kube-api-access-c4j9v\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.376816 master-0 kubenswrapper[16352]: I0307 22:01:00.376569 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-config-data\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 
07 22:01:00.380780 master-0 kubenswrapper[16352]: I0307 22:01:00.380648 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-fernet-keys\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.383162 master-0 kubenswrapper[16352]: I0307 22:01:00.383073 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-config-data\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.387326 master-0 kubenswrapper[16352]: I0307 22:01:00.387272 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-combined-ca-bundle\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.395913 master-0 kubenswrapper[16352]: I0307 22:01:00.395854 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4j9v\" (UniqueName: \"kubernetes.io/projected/f78feced-985c-4119-8982-caf82ad84f0a-kube-api-access-c4j9v\") pod \"keystone-cron-29548681-5hg8p\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:00.508934 master-0 kubenswrapper[16352]: I0307 22:01:00.508642 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:01.024877 master-0 kubenswrapper[16352]: I0307 22:01:01.024796 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29548681-5hg8p"] Mar 07 22:01:01.611520 master-0 kubenswrapper[16352]: I0307 22:01:01.611283 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548681-5hg8p" event={"ID":"f78feced-985c-4119-8982-caf82ad84f0a","Type":"ContainerStarted","Data":"71b3d34e97c5f9f27fed81cc1a73e605ae8115f662a7f46ad7286cb8c9c74988"} Mar 07 22:01:01.611520 master-0 kubenswrapper[16352]: I0307 22:01:01.611448 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548681-5hg8p" event={"ID":"f78feced-985c-4119-8982-caf82ad84f0a","Type":"ContainerStarted","Data":"637b24880fc1e40cb89168fe20033dfecc6147e9359afe665c40c071fccfe9ed"} Mar 07 22:01:01.675150 master-0 kubenswrapper[16352]: I0307 22:01:01.675031 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29548681-5hg8p" podStartSLOduration=1.6749930050000001 podStartE2EDuration="1.674993005s" podCreationTimestamp="2026-03-07 22:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-07 22:01:01.648293036 +0000 UTC m=+2584.718998105" watchObservedRunningTime="2026-03-07 22:01:01.674993005 +0000 UTC m=+2584.745698074" Mar 07 22:01:03.660727 master-0 kubenswrapper[16352]: I0307 22:01:03.660641 16352 generic.go:334] "Generic (PLEG): container finished" podID="f78feced-985c-4119-8982-caf82ad84f0a" containerID="71b3d34e97c5f9f27fed81cc1a73e605ae8115f662a7f46ad7286cb8c9c74988" exitCode=0 Mar 07 22:01:03.660727 master-0 kubenswrapper[16352]: I0307 22:01:03.660720 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548681-5hg8p" 
event={"ID":"f78feced-985c-4119-8982-caf82ad84f0a","Type":"ContainerDied","Data":"71b3d34e97c5f9f27fed81cc1a73e605ae8115f662a7f46ad7286cb8c9c74988"} Mar 07 22:01:05.215329 master-0 kubenswrapper[16352]: I0307 22:01:05.215266 16352 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:01:05.347559 master-0 kubenswrapper[16352]: I0307 22:01:05.347479 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4j9v\" (UniqueName: \"kubernetes.io/projected/f78feced-985c-4119-8982-caf82ad84f0a-kube-api-access-c4j9v\") pod \"f78feced-985c-4119-8982-caf82ad84f0a\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " Mar 07 22:01:05.347855 master-0 kubenswrapper[16352]: I0307 22:01:05.347621 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-combined-ca-bundle\") pod \"f78feced-985c-4119-8982-caf82ad84f0a\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " Mar 07 22:01:05.347855 master-0 kubenswrapper[16352]: I0307 22:01:05.347817 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-config-data\") pod \"f78feced-985c-4119-8982-caf82ad84f0a\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " Mar 07 22:01:05.347942 master-0 kubenswrapper[16352]: I0307 22:01:05.347886 16352 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-fernet-keys\") pod \"f78feced-985c-4119-8982-caf82ad84f0a\" (UID: \"f78feced-985c-4119-8982-caf82ad84f0a\") " Mar 07 22:01:05.351794 master-0 kubenswrapper[16352]: I0307 22:01:05.351717 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/f78feced-985c-4119-8982-caf82ad84f0a-kube-api-access-c4j9v" (OuterVolumeSpecName: "kube-api-access-c4j9v") pod "f78feced-985c-4119-8982-caf82ad84f0a" (UID: "f78feced-985c-4119-8982-caf82ad84f0a"). InnerVolumeSpecName "kube-api-access-c4j9v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 07 22:01:05.353794 master-0 kubenswrapper[16352]: I0307 22:01:05.353726 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "f78feced-985c-4119-8982-caf82ad84f0a" (UID: "f78feced-985c-4119-8982-caf82ad84f0a"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 22:01:05.393194 master-0 kubenswrapper[16352]: I0307 22:01:05.393021 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f78feced-985c-4119-8982-caf82ad84f0a" (UID: "f78feced-985c-4119-8982-caf82ad84f0a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 22:01:05.423328 master-0 kubenswrapper[16352]: I0307 22:01:05.423249 16352 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-config-data" (OuterVolumeSpecName: "config-data") pod "f78feced-985c-4119-8982-caf82ad84f0a" (UID: "f78feced-985c-4119-8982-caf82ad84f0a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 07 22:01:05.469959 master-0 kubenswrapper[16352]: I0307 22:01:05.469867 16352 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-config-data\") on node \"master-0\" DevicePath \"\"" Mar 07 22:01:05.469959 master-0 kubenswrapper[16352]: I0307 22:01:05.469951 16352 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-fernet-keys\") on node \"master-0\" DevicePath \"\"" Mar 07 22:01:05.469959 master-0 kubenswrapper[16352]: I0307 22:01:05.469967 16352 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4j9v\" (UniqueName: \"kubernetes.io/projected/f78feced-985c-4119-8982-caf82ad84f0a-kube-api-access-c4j9v\") on node \"master-0\" DevicePath \"\"" Mar 07 22:01:05.469959 master-0 kubenswrapper[16352]: I0307 22:01:05.469982 16352 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f78feced-985c-4119-8982-caf82ad84f0a-combined-ca-bundle\") on node \"master-0\" DevicePath \"\"" Mar 07 22:01:05.693437 master-0 kubenswrapper[16352]: I0307 22:01:05.693230 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29548681-5hg8p" event={"ID":"f78feced-985c-4119-8982-caf82ad84f0a","Type":"ContainerDied","Data":"637b24880fc1e40cb89168fe20033dfecc6147e9359afe665c40c071fccfe9ed"} Mar 07 22:01:05.693437 master-0 kubenswrapper[16352]: I0307 22:01:05.693298 16352 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="637b24880fc1e40cb89168fe20033dfecc6147e9359afe665c40c071fccfe9ed" Mar 07 22:01:05.693437 master-0 kubenswrapper[16352]: I0307 22:01:05.693356 16352 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29548681-5hg8p" Mar 07 22:02:39.610714 master-0 kubenswrapper[16352]: I0307 22:02:39.610490 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rnm26/must-gather-d9jdv"] Mar 07 22:02:39.611725 master-0 kubenswrapper[16352]: E0307 22:02:39.611504 16352 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f78feced-985c-4119-8982-caf82ad84f0a" containerName="keystone-cron" Mar 07 22:02:39.611725 master-0 kubenswrapper[16352]: I0307 22:02:39.611528 16352 state_mem.go:107] "Deleted CPUSet assignment" podUID="f78feced-985c-4119-8982-caf82ad84f0a" containerName="keystone-cron" Mar 07 22:02:39.618859 master-0 kubenswrapper[16352]: I0307 22:02:39.611947 16352 memory_manager.go:354] "RemoveStaleState removing state" podUID="f78feced-985c-4119-8982-caf82ad84f0a" containerName="keystone-cron" Mar 07 22:02:39.618859 master-0 kubenswrapper[16352]: I0307 22:02:39.613897 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:39.618859 master-0 kubenswrapper[16352]: I0307 22:02:39.618420 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rnm26"/"kube-root-ca.crt" Mar 07 22:02:39.622710 master-0 kubenswrapper[16352]: I0307 22:02:39.620761 16352 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-rnm26"/"openshift-service-ca.crt" Mar 07 22:02:39.633474 master-0 kubenswrapper[16352]: I0307 22:02:39.633370 16352 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-rnm26/must-gather-gqknr"] Mar 07 22:02:39.638272 master-0 kubenswrapper[16352]: I0307 22:02:39.638154 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:39.668614 master-0 kubenswrapper[16352]: I0307 22:02:39.668523 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df0562ff-4f1a-4a00-a1a6-561a2e664a40-must-gather-output\") pod \"must-gather-d9jdv\" (UID: \"df0562ff-4f1a-4a00-a1a6-561a2e664a40\") " pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:39.669235 master-0 kubenswrapper[16352]: I0307 22:02:39.669160 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf4f39ec-89dc-429f-b70b-b4114ae2f7c6-must-gather-output\") pod \"must-gather-gqknr\" (UID: \"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6\") " pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:39.669790 master-0 kubenswrapper[16352]: I0307 22:02:39.669482 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww2qc\" (UniqueName: \"kubernetes.io/projected/df0562ff-4f1a-4a00-a1a6-561a2e664a40-kube-api-access-ww2qc\") pod \"must-gather-d9jdv\" (UID: \"df0562ff-4f1a-4a00-a1a6-561a2e664a40\") " pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:39.669790 master-0 kubenswrapper[16352]: I0307 22:02:39.669582 16352 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7lzl\" (UniqueName: \"kubernetes.io/projected/cf4f39ec-89dc-429f-b70b-b4114ae2f7c6-kube-api-access-c7lzl\") pod \"must-gather-gqknr\" (UID: \"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6\") " pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:39.671102 master-0 kubenswrapper[16352]: I0307 22:02:39.671038 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rnm26/must-gather-gqknr"] Mar 07 
22:02:39.722421 master-0 kubenswrapper[16352]: I0307 22:02:39.722318 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rnm26/must-gather-d9jdv"] Mar 07 22:02:39.773767 master-0 kubenswrapper[16352]: I0307 22:02:39.772842 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df0562ff-4f1a-4a00-a1a6-561a2e664a40-must-gather-output\") pod \"must-gather-d9jdv\" (UID: \"df0562ff-4f1a-4a00-a1a6-561a2e664a40\") " pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:39.774403 master-0 kubenswrapper[16352]: I0307 22:02:39.774348 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/df0562ff-4f1a-4a00-a1a6-561a2e664a40-must-gather-output\") pod \"must-gather-d9jdv\" (UID: \"df0562ff-4f1a-4a00-a1a6-561a2e664a40\") " pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:39.774887 master-0 kubenswrapper[16352]: I0307 22:02:39.774835 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf4f39ec-89dc-429f-b70b-b4114ae2f7c6-must-gather-output\") pod \"must-gather-gqknr\" (UID: \"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6\") " pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:39.775090 master-0 kubenswrapper[16352]: I0307 22:02:39.775073 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww2qc\" (UniqueName: \"kubernetes.io/projected/df0562ff-4f1a-4a00-a1a6-561a2e664a40-kube-api-access-ww2qc\") pod \"must-gather-d9jdv\" (UID: \"df0562ff-4f1a-4a00-a1a6-561a2e664a40\") " pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:39.775240 master-0 kubenswrapper[16352]: I0307 22:02:39.775223 16352 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7lzl\" 
(UniqueName: \"kubernetes.io/projected/cf4f39ec-89dc-429f-b70b-b4114ae2f7c6-kube-api-access-c7lzl\") pod \"must-gather-gqknr\" (UID: \"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6\") " pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:39.776133 master-0 kubenswrapper[16352]: I0307 22:02:39.776070 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/cf4f39ec-89dc-429f-b70b-b4114ae2f7c6-must-gather-output\") pod \"must-gather-gqknr\" (UID: \"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6\") " pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:39.811729 master-0 kubenswrapper[16352]: I0307 22:02:39.807304 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7lzl\" (UniqueName: \"kubernetes.io/projected/cf4f39ec-89dc-429f-b70b-b4114ae2f7c6-kube-api-access-c7lzl\") pod \"must-gather-gqknr\" (UID: \"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6\") " pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:39.811729 master-0 kubenswrapper[16352]: I0307 22:02:39.810469 16352 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww2qc\" (UniqueName: \"kubernetes.io/projected/df0562ff-4f1a-4a00-a1a6-561a2e664a40-kube-api-access-ww2qc\") pod \"must-gather-d9jdv\" (UID: \"df0562ff-4f1a-4a00-a1a6-561a2e664a40\") " pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:40.022974 master-0 kubenswrapper[16352]: I0307 22:02:40.022809 16352 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-rnm26/must-gather-d9jdv" Mar 07 22:02:40.038246 master-0 kubenswrapper[16352]: I0307 22:02:40.038184 16352 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-rnm26/must-gather-gqknr" Mar 07 22:02:40.627805 master-0 kubenswrapper[16352]: I0307 22:02:40.627654 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rnm26/must-gather-d9jdv"] Mar 07 22:02:40.631820 master-0 kubenswrapper[16352]: I0307 22:02:40.631615 16352 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 07 22:02:40.762468 master-0 kubenswrapper[16352]: I0307 22:02:40.762099 16352 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-rnm26/must-gather-gqknr"] Mar 07 22:02:40.767437 master-0 kubenswrapper[16352]: W0307 22:02:40.767371 16352 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf4f39ec_89dc_429f_b70b_b4114ae2f7c6.slice/crio-8672db976949718cce883d652c6cbefe6fca7e019475270b635739aec88d30a9 WatchSource:0}: Error finding container 8672db976949718cce883d652c6cbefe6fca7e019475270b635739aec88d30a9: Status 404 returned error can't find the container with id 8672db976949718cce883d652c6cbefe6fca7e019475270b635739aec88d30a9 Mar 07 22:02:41.280830 master-0 kubenswrapper[16352]: I0307 22:02:41.280630 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rnm26/must-gather-d9jdv" event={"ID":"df0562ff-4f1a-4a00-a1a6-561a2e664a40","Type":"ContainerStarted","Data":"ea4170d66ae0ad0cc969f17b86e38ef29eb8da24356294069776f2b9626ad6ad"} Mar 07 22:02:41.285616 master-0 kubenswrapper[16352]: I0307 22:02:41.285572 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rnm26/must-gather-gqknr" event={"ID":"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6","Type":"ContainerStarted","Data":"8672db976949718cce883d652c6cbefe6fca7e019475270b635739aec88d30a9"} Mar 07 22:02:43.315758 master-0 kubenswrapper[16352]: I0307 22:02:43.315631 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-must-gather-rnm26/must-gather-gqknr" event={"ID":"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6","Type":"ContainerStarted","Data":"5aeda3dd0952de32ffe8dd911e6941a443f62ce2e35bc351b507fc20cf952412"} Mar 07 22:02:43.315758 master-0 kubenswrapper[16352]: I0307 22:02:43.315740 16352 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-rnm26/must-gather-gqknr" event={"ID":"cf4f39ec-89dc-429f-b70b-b4114ae2f7c6","Type":"ContainerStarted","Data":"f3e52d8c7b6831f42c91cbb76d811f2f69179b5cced646cb47211dc8ee99a989"} Mar 07 22:02:43.386715 master-0 kubenswrapper[16352]: I0307 22:02:43.378054 16352 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-rnm26/must-gather-gqknr" podStartSLOduration=3.051475241 podStartE2EDuration="4.378028995s" podCreationTimestamp="2026-03-07 22:02:39 +0000 UTC" firstStartedPulling="2026-03-07 22:02:40.77226568 +0000 UTC m=+2683.842970739" lastFinishedPulling="2026-03-07 22:02:42.098819434 +0000 UTC m=+2685.169524493" observedRunningTime="2026-03-07 22:02:43.36528051 +0000 UTC m=+2686.435985569" watchObservedRunningTime="2026-03-07 22:02:43.378028995 +0000 UTC m=+2686.448734064" Mar 07 22:02:45.145199 master-0 kubenswrapper[16352]: I0307 22:02:45.145122 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-s44f4_96cfa9d3-fc26-42e9-8bac-ff2c25223654/cluster-version-operator/1.log" Mar 07 22:02:45.673954 master-0 kubenswrapper[16352]: I0307 22:02:45.673869 16352 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-version_cluster-version-operator-8c9c967c7-s44f4_96cfa9d3-fc26-42e9-8bac-ff2c25223654/cluster-version-operator/0.log"